

Notebooks of the Fundación General CSIC / Nº 5 / June 2011 / Published quarterly / Price: 9 euros


LYCHNOS
Notebooks of the Fundación General CSIC

Nº 5 JUNE 2011

Executive Editor: Reyes Sequera
Assistant Editor: Sira Laguna
Page layout: DiScript Preimpresión, S. L.
Illustration: Lola Gómez
Translation: Duncan Gilson (except the article by Peter Wagner, which was submitted in English)

Published by: Fundación General CSIC
President: Rafael Rodrigo Montero
Director: Javier Rey Campos
Address: Príncipe de Vergara, nº 9 - 2ª derecha; Madrid 28001

www.fgcsic.es

© Fundación General CSIC, 2011. All rights reserved. Use by third parties of the contents of this journal without the prior written consent of the copyright holder may constitute a criminal offence under intellectual property law.

Printed by: DiScript Preimpresión, S. L.
Legal Deposit: M-33022-2010
ISSN: 2172-0207


CONTENTS. LYCHNOS Nº 5 JUNE 2011

01 Frontier research … .......... 4
The frontiers within. Fernando Broncano .......... 6

02 … on matter and energy .......... 12
02.1 ITER and fusion energy. Joaquín Sánchez .......... 14
02.2 The Higgs boson. Álvaro de Rújula .......... 21
02.3 M theory (strings). Enrique Álvarez .......... 27

03 … in the exploration of the universe .......... 32
03.1 The darkness of the cosmos. Enrique Gaztañaga .......... 34
03.2 The exoplanetary bestiary. David Barrado-Navascués .......... 40
03.3 Large astronomical facilities: astrophysics on a grand scale. Antxon Alberdi .......... 46

04 … in the borderlands .......... 54
04.1 The present and future of synthetic biology. Javier Macía and Ricard Solé .......... 56
04.2 Quantum information. Antonio Acín .......... 60
04.3 When economics met physiology. Enrique Turiégano .......... 66
04.4 The millennium problems. Manuel de León .......... 72

05 … in studies into society and culture .......... 78
05.1 Frontier research in the Humanities and Social Sciences. Javier Moscoso .......... 80
05.2 Language: learning and use. Núria Sebastián .......... 85
05.3 Modernity: understanding our present time. Peter Wagner .......... 90

06 Forum .......... 96
Frontier research: bringing the future closer. Javier Rey .......... 98

07 News .......... 100

01 Frontier research …


The frontiers within

According to the author, the idea of a frontier, as the ultimate boundary reached by our knowledge, beyond which great rewards can be expected, has been a persistent metaphor that scientists have used to talk about their activity throughout the history of science. In more recent usage, it conjures up the great wagon trains crossing the plains, suffering all manner of mishaps and perils, but ultimately, with luck, reaching their promised land. The enterprise of knowledge was understood as a strategic undertaking, that is to say, as endeavouring to attain something that was itself no more than a means to a further end: exploitable knowledge.

Fernando Broncano

Universidad Carlos III

Just over ten years ago the science journalist John Horgan wrote a book with the prophetic title "The End of Science". In it, Horgan suggested that science had already reached its final frontiers and that henceforth all that remained was progressively less ambitious and ever more academic and superficial research.

As little more than a decade has passed since the book was published, it is perhaps still too early to give an opinion on its predictions. Science is a marathon, a long-distance race rather than a sprint, and it is difficult to extrapolate the trends observed in recent years' findings to the wider historical scale. However, the question Horgan asked is of interest in itself in other ways, regardless of how we answer it. The very fact of asking about the end of science implies a certain way of seeing it, a certain way in which science and technology's practitioners conceptualise their work.

Horgan is part of a long tradition conceptualising the enterprise of scientific knowledge in terms of the metaphor of the frontier. Another work in this tradition which, in view of its enduring impact, cannot be ignored, and which is central to understanding the power of metaphors, is the report that Vannevar Bush sent to the President of the United States in 1945 under the title "Science: the Endless Frontier". Vannevar Bush, an electrical engineer, was at the time director of the Office of Scientific Research and Development (OSRD), an agency set up during the Second World War to mobilise scientific research for the war effort. It was the agency that directed the Manhattan Project that built the atom bomb.



At the end of the war, Bush feared that society's new forms of organising science, with public support marshalling resources on a huge scale to provide personnel, infrastructure, institutions, and funding, might end with the war, and he wrote his report with the explicit aim that this programme be continued in the future. In the letter of transmittal which he sent to President Truman accompanying the report, Bush wrote:

"The pioneer spirit is still vigorous within this nation. Science offers a largely unexplored hinterland for the pioneer who has the tools for his task. The rewards of such exploration both for the Nation and the individual are great. Scientific progress is one essential key to our security as a nation, to our better health, to more jobs, to a higher standard of living, and to our cultural progress."

In this paragraph he brought together what would be the two guiding principles running through the report: first, the basic objective, which was simply to engender a willingness on the part of the government to support basic research; and secondly, the rhetorical device used to justify this claim. Here, the metaphor of the endless frontier invoked in the imagination the idea of riches of all kinds awaiting pioneers who dared to venture out into that distant unexplored land. Science was seen as an enterprise of exploration of an unknown space in which riches were waiting to be discovered, although the effort made might not seem to bring immediate reward.

Fernando Broncano

Has a Ph.D. in Philosophy from the University of Salamanca, and was professor of Logic and the Philosophy of Science at the University of Salamanca until the 1999-2000 academic year.

His field is the notion of rationality in its theoretical, epistemological and practical facets. In the epistemological area, he has worked on the problems of rationality in science, its cognitive aspects, and the rationality of scientific communities. From this field he moved on to more general problems in the Philosophy of Mind (bounded rationality, collective rationality, rationality and the emotions). In relation to practical rationality, he has mainly focused on the Philosophy of Technology: skills, plans, collective design capacity, etc.

He is currently working on the importance of meta-representational capacities in culture and science. He takes a militant stance against the split between scientific and humanistic culture, considering experience to define the measure of all human activity. His books include: Mundos artificiales (2000), Fondo de Cultura Económica; Saber en condiciones (2003), Antonio Machado; Entre ingenieros y ciudadanos (2006), Montesinos; and La melancolía del ciborg (2009), Herder.

Science is a marathon, a long-distance race rather than a sprint. It is difficult to extrapolate the trends observed in recent years’ findings to the wider historical scale

Fernando Broncano.


In the imagination of the time were the great wagon trains crossing the plains, suffering all manner of mishaps and perils, but ultimately, with luck, reaching their promised land. The quest for knowledge was understood as a strategic enterprise, that is to say, as an undertaking to achieve something that was itself no more than a means to an end: exploitable knowledge. The connotations of the treasures hidden in unexplored lands lie in the deepest of human metaphors: the journey towards a promised land.

The idea of a frontier, as the final boundary reached by our knowledge, beyond which great rewards can be expected, has been a persistent metaphor that scientists have used to talk about their work throughout its history. Newton, for example, attributed his success to his standing on the shoulders of giants to see further than others. The idea that knowledge constitutes a space explains why the suspicion that we have reached the limits of the territory is also used as a rhetorical device. It is the device used by Horgan, and was possibly also a widely held belief at the end of the 19th century, as is exemplified by the remark attributed to William Thomson, Lord Kelvin, who said that physics had reached its limits and all that remained would be just a matter of adding decimals to the precision of the solutions of the basic equations. Whether true or false, this anecdote exemplifies the mentality of scientists working within a paradigm, whose limits they inevitably understand as the limits of what is knowable. That the claim of having reached the ultimate limits coincides in time with the great crisis in science that led to relativity, quantum mechanics, the mathematics of transfinite numbers, population genetics, game theory, etc. is a sort of historical irony, and shows that at times the explorers are not quite as farsighted as they think. Metaphors can be illuminating, but sometimes they also leave shadows. The metaphor of the frontier sometimes leads to the belief that the space being explored has only the two dimensions of a surface. But, as Richard Feynman pointed out, in one of the first calls for what would much later come to be called nanotechnology, there is also a lot of space "down below."

In fact, the metaphor of the frontier is actually two metaphors rather than one. For example, among its definitions for the term frontier, the Oxford dictionary includes: 1. the border between two countries; and 2. the border between settled and unsettled country. It is this second meaning that has the connotations of the pioneer. But the first of these two senses, that of a boundary between countries, refers to another type of border, namely those that limit territories that are already occupied. By whom? In the case of science, the inhabitants are the scientific disciplines.

Science was seen as an enterprise of exploration of an unknown space in which valuable treasures were waiting to be discovered

The disciplines are not mere aggregates of researchers or lecturers. They are the institutions that have made up the structure of science since its beginnings. They are the way in which research is organised into communities of researchers linked by internal bonds of trust and mutual supervision of research. They are made up of "invisible colleges," to use the term coined by the sociologist Derek J. de Solla Price to refer to the links of respect that organise their activities, and are expressed in citations, in how scientists follow the work done by other scientists, in how their work is steered, in the education new scientists receive, etc. They also operate at the level of the internal composition of learned societies, the editorial boards of journals, and the peer reviewers judging the quality of other scientists' work, and in the system of rewards and prizes. In short, in everything that imbues the scientific community with its internal life.

Since the emergence of science in its contemporary form in the 19th century (or perhaps in the previous century), the disciplines fostered the social division of cognitive labour in science and were an extremely powerful tool in the development of knowledge, constituting themselves as both the organs of knowledge creation and quality control. It was the disciplines that shaped scientific knowledge as a form of knowledge that is


highly filtered by collective criticism, subject to equal measures of creative energy and scepticism in the assessment of findings. As they emerged, the disciplines corresponded to the various domains of reality, in line with how the general map of knowledge was charted. In the 19th century, for example, the old rational mechanics, a branch of mathematics, became the most abstract part of what had previously been called, in view of its speculative nature, natural philosophy, and which at this time began to be called Physics. Something similar happened in the case of Chemistry, Biology, etc. Each of these new macro-disciplines developed a tree structure of new subdisciplines as robust areas of research were formed. (Incidentally, the metaphor of the "tree of knowledge" is another of the great perennial metaphors of science; it is, for example, part of the iconographic tradition of the Spanish National Research Council (CSIC). Unlike the more modern metaphor of the endless frontier, the tree refers to a mediaeval tradition in which the CSIC sought to root its origins.)

The disciplines gradually became for science what countries are in the political domain: hierarchical institutions able to exert control over the direction of research. As Thomas S. Kuhn observed, they were articulated around paradigms that mobilised research efforts in directions organised by the principles of the prevailing paradigm. Kuhn also noted that this gave rise to what he called the "essential tension" of science: namely the tension between obedience and creativity, between submission to the discipline and the desire to explore the unknown. The disciplines, at the same time as being an instrument of development, also constrained the creative imagination of many new researchers who wanted to explore the greyer areas of each paradigm.

This gives rise to a new form of frontier which has less to do with the confines of the unknown than with grey areas within what is already known. In the 19th century, many internal movements in science began to cross the borders of the established disciplines. Two of the major concepts that articulated the understanding of nature in the 19th century arose in a transdisciplinary way. The first was the idea of energy. The discovery of the law of conservation of energy, the starting point for the whole set of natural laws, originated from the convergence of research by many different people, including doctors (Mayer), engineers (Sadi Carnot), experimentalists (Joule, Faraday), mathematical physicists (Helmholtz), and even the natural philosophers of German Romanticism. The concept of energy itself was a transdisciplinary concept that encompassed what was common and interactive to causal systems on all levels of organisation of reality. Another of the great transdisciplinary principles of the 19th century was that of evolution.

For its development, the theory of evolution had to bring together researchers belonging to fields as diverse as geology and palaeontology, taxonomic biology, developmental biology, and statistics; these and certain other disciplines lent their resources to enable research into a domain outside the boundary of any of the previous ones.

The disciplines are not mere aggregates of researchers or lecturers. They are the institutions that have made up the structure of science since its inception

Contemporary science has explored frontiers in the sense of the boundaries of the unknown with an astonishing drive to explore the remotest corners of reality: exploring the extremes of size in terms of the upper limits (the size of the universe, its mass, deep time in the history of the universe) and lower limits (the fabric of space and time on the quantum scale), and the extremes of complexity (neural systems, computer systems). John Horgan's diagnosis might seem to be justified if there were no frontiers left and we were waiting for that inevitable "Theory of Everything," after which all that would remain would be to fill in the details in the maps of the universe. But it would be wrong to think in these terms. As happened in the 19th century, it was transdisciplinary concepts that bore the most surprising fruit of contemporary knowledge. Thus, if the 19th century was the century of


energy, the 20th century produced another of the transdisciplinary concepts which is still driving convergence between disciplines: the concept of information. Born in the seemingly distant fields of basic research in electronic engineering, the concept of information has revolutionised all fields of contemporary knowledge, from technology and biology to fundamental physics. Networks and adaptive systems are the new concepts with the power to articulate and reorganise knowledge.

If we look at science from a distance, from the perspective of its history and its sociological composition, it is perhaps possible to detect that this second form of life on the frontier has been much more common than it might at first seem. Research on these frontiers is often referred to as "interdisciplinary" research, but this is misleading, as it makes it sound as though it were marginal research in which researchers trained in one discipline, but perhaps not working at its cutting edge (in the sense of the limits of the unknown), devoted their spare time to working with other colleagues, like paramedics acting as impromptu midwives at a premature birth on the way to hospital. That is not the case. In fact, the frontiers between the disciplines are home to a great deal of scientific life, and it is a life that, like all life at the frontier, involves much more creative energy than it is usually credited with. It is true that at times disciplines are subdivided into subdisciplines, but the most novel disciplinary structures of research have emerged at the frontiers of the disciplines in a new transdisciplinary way, inventing new methods and concepts and mixing the seemingly immiscible. In these new knowledge niches, science and technology are intertwined and can swap roles. Engineering, ostensibly more applied, turns into basic or even speculative research, as is the case today with robotics, which is engaged in simulating intelligent minds and bodies; or the biological sciences turn into engineering, as is happening in basic research in computer science. It is in these no-man's-lands that fields such as the cognitive sciences, robotics, neuroscience, and exobiology have been created.

As well as the many practical reasons to look more closely at the borderlands between disciplines, there are also many powerful theoretical reasons. We have always viewed concepts as stable structures defined by the theories (scientific or philosophical) that contain them, changing, if they change at all, in step with these structural theories. But concepts, like human creativity, are much more transgressive of these boundaries than it might seem. Concepts defined in one discipline migrate to others, or turn into promoters of ideas remote from the conditions in which they emerged as a means of classifying a piece of reality, thus turning into drivers of creativity. In the humanities, this conceptual liberty is well established and at times causes considerable irritation, as it seems to transgress the frontiers of the rational (a case in point is the invectives by the physicists Alan Sokal and Jean Bricmont in 1998 against post-modernist philosophers who used concepts, or at least terms, taken from physics metaphorically without concerning themselves with their scientific meanings). There are, certainly, good reasons for being irritated by the lack of conceptual clarity, but one should stop and think whether science itself, looked at from a distance rather than in the harsh light of its day-to-day work, has not itself made such transgressions more often than it might at first seem. Newton used bars and strings to build physical models of his equations; Maxwell used rollers, vortices and cells to imagine the ether; Planck imagined resonators and technological mechanisms when trying to solve the problem of the black body. All of them, perhaps unwittingly, stood on metaphors to help them peer into the unknown. They too were transgressors of boundaries.

The disciplines fostered the social division of cognitive labour in science and were extremely powerful tools for the development of knowledge, acting simultaneously as both organs of creation and quality control

The "Workshops FGCSIC" initiative comprises a series of topic-focused workshops for training and debate. They represent an opportunity for high-level meetings between researchers, entrepreneurs, professionals and others with an interest in various aspects of R&D and innovation. With a streamlined and innovative format, they aim to encourage the exchange of knowledge by promoting a participative and constructive environment.

Workshops FGCSIC are free, but participant numbers are limited. For more information, see www.fgcsic.es/workshops

Workshops FGCSIC 2011:
Technology watch and competitive intelligence
Strategic planning of R&D
Scientific foresight
Science and technology output metrics

02 … on matter and energy


ITER and fusion energy

Fusion energy could play an important role as a large-scale energy source in the second half of the century. Its advantages include the fact that it is environmentally friendly, and its raw materials are abundant and widely distributed around the globe. ITER, the International Thermonuclear Experimental Reactor, is the tool with which scientists hope to obtain energy from nuclear fusion.

Joaquín Sánchez

Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT)

Fusion as an energy source

Humanity currently consumes about 10 terawatts (TW) of power, equivalent to the output of 10,000 large nuclear power stations. The figure alone is staggering, but the situation becomes even more worrying when we look more closely at some of the details. First of all, much of this energy is still produced by burning fossil fuels on a massive scale, including oil at a rate of almost 90 million barrels a day, with all the problems that this entails. Secondly, consumption is far from uniform, being much higher in the developed world. This makes it conceivable that this figure of 10 TW will increase significantly over the coming decades as developing countries raise their living standards and their energy demand rises accordingly.

Faced with this major problem, technology offers a range of potential solutions, one of which is fusion energy. This technology may come to play an important role in the second half of the century as a large-scale source of energy, given that it is environmentally friendly and based on abundant raw materials found all around the globe.

Fusion is the process by which nuclei, usually small ones, combine to create somewhat larger nuclei, with a loss of mass in the process. This mass is transformed into the kinetic energy of the resulting particles and nuclei in accordance with Einstein's formula (E = mc²).

The Sun obtains its energy through a series of reactions that start with the fusion of two hydrogen nuclei to produce deuterium. This reaction takes place due to the powerful gravitational force at the centre of the star, which compresses hydrogen to densities of 10³² protons per cubic metre at temperatures of 1.5 keV (16 million degrees Kelvin). In the absence of this powerful gravitational force, it is


impossible with our current capabilities to achieve this reaction between two protons. Therefore research into fusion as a source of energy is focused on another reaction with a much larger cross section, namely the fusion of a deuterium nucleus with a tritium nucleus (the D-T reaction) to produce a helium nucleus and a neutron, with a combined energy of 17.5 MeV (Figure 1).
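These figures can be checked on the back of an envelope. The sketch below is an added illustration, not part of the article: it uses standard tabulated rest masses to recover the energy release, and momentum conservation to show how that energy is split, consistent with the 80% share carried by the neutron that is mentioned later in this article.

```python
# Back-of-the-envelope check of the D-T reaction figures (illustrative;
# the article quotes 17.5 MeV, tabulated masses give ~17.6 MeV).

# Rest masses in MeV/c^2 (standard tabulated values).
m_D = 1875.613    # deuteron
m_T = 2808.921    # triton
m_He4 = 3727.379  # helium-4 nucleus
m_n = 939.565     # neutron

# Energy released = mass defect (E = mc^2, with c = 1 in these units).
Q = (m_D + m_T) - (m_He4 + m_n)
print(f"Energy released Q = {Q:.1f} MeV")  # ~17.6 MeV

# Momentum conservation: the two products have equal and opposite momenta,
# so their kinetic energies E = p^2 / 2m split inversely with mass.
E_n = Q * m_He4 / (m_He4 + m_n)
E_He = Q * m_n / (m_He4 + m_n)
print(f"Neutron: {E_n:.1f} MeV ({100 * E_n / Q:.0f}% of the total)")
print(f"Helium:  {E_He:.1f} MeV ({100 * E_He / Q:.0f}% of the total)")
```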

The raw material for the D-T reaction is extremely abundant in nature and is distributed throughout all the regions of the world. Normal water contains 33 mg of deuterium per litre, and tritium can be produced from lithium, another very abundant element, which can be obtained in several ways, including extraction from sea salt. One big advantage is that the amounts required are quite modest, so the potential impact of the price of lithium on the final price of energy would be very small. With a per capita consumption like today's, mankind could obtain all the energy it needs using a gram of lithium per person a year. On the other hand, the reaction product, helium, is one of the most innocuous elements that exists (among other things, it is used to fill children's balloons); it does not cause a greenhouse effect, and does not accumulate in the atmosphere.
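The gram-per-person figure can be sanity-checked as well. The rough sketch below is an added illustration; the world population figure and the idealised assumption that every lithium atom breeds one tritium nucleus that then fuses are mine, not the author's. It shows the claim is the right order of magnitude.

```python
# Illustrative order-of-magnitude check of "a gram of lithium per person
# per year" (assumptions: ~7e9 people in 2011; every Li atom breeds one
# tritium nucleus that fuses; conversion losses ignored).

AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13
YEAR_S = 3.156e7

# Per-capita demand: ~10 TW shared among ~7e9 people.
power_per_person = 10e12 / 7e9               # ~1.4 kW
energy_per_year = power_per_person * YEAR_S  # ~4.5e10 J

# Ideal yield of 1 g of natural lithium (molar mass ~6.94 g/mol),
# at ~17.6 MeV per D-T fusion.
atoms_per_gram = AVOGADRO / 6.94
energy_per_gram = atoms_per_gram * 17.6 * MEV_TO_J  # ~2.4e11 J

print(f"Annual per-capita demand: {energy_per_year:.1e} J")
print(f"Ideal yield of 1 g Li:    {energy_per_gram:.1e} J")
# The ideal yield exceeds the demand several-fold, so even with realistic
# inefficiencies the gram-per-person-per-year figure is the right order.
```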

Joaquín Sánchez

Has a Ph.D. in Physics from the Madrid Complutense University (1986), after which he joined the Spanish national energy, environment and technology centre (CIEMAT) as an expert in plasma diagnostics.

After two years at the Max Planck Institute in Garching (Germany) he became manager of diagnostic systems on CIEMAT's successive magnetic confinement fusion devices: TJ-I, TJ-IU and finally TJ-II. Since June 2004 he has been director of the Spanish National Fusion Laboratory at CIEMAT.

He has worked with the Max Planck IPP (Germany), Oak Ridge National Laboratory (USA), Princeton University (USA), the National Institute for Fusion Science (Japan) and the Massachusetts Institute of Technology (USA). Between 2000 and 2003 he was task force leader for scientific operation of diagnostic systems at the European Union's JET experiment in Culham (United Kingdom). He is currently research coordinator for the Consolider Fusion Technology project and chairs the Fusion Technology Platform, an organisation aimed at promoting the participation of Spanish industry in the ITER fusion project.

He is Spain's representative on the governing council of the ITER Fusion for Energy joint undertaking in Barcelona, vice-president of the EURATOM consultative committee on fusion energy, and chairman of the group of chairpersons of the European fusion programme.

With a per capita consumption like today’s, humanity could obtain all the energy it needs using a gram of lithium per person a year

And in any event, fusion energy would produce gases at a rate of around 6,000 tonnes a year, which is a very modest quantity when compared with the 10¹⁰ tonnes a year of CO₂ currently emitted into the atmosphere.

So far this presentation has focused on the advantages of this energy source, but what about the difficulties? The biggest challenge is achieving the reaction itself. In order to fuse two nuclei they first have to be brought extremely close together. To do so, the electrostatic repulsion between them has to be overcome so that they are sufficiently close for the attraction of the nuclear forces to come into play.

Joaquín Sánchez.


One way of doing this is to collide them at high speed, which can be done by accelerating the deuterium and tritium to energies of 20 keV. Achieving these speeds is not a problem; indeed, particles are nowadays accelerated to energies millions of times greater, but the energy efficiency of the process is an issue. Even if we accelerate some nuclei towards each other, the electrostatic or Coulomb repulsion makes head-on collisions very unlikely, so most of the particles will cross each other's paths without colliding and fusion reactions will not take place.

The only way to achieve a reaction rate that makes the process cost-effective in energy terms is to keep the deuterium-tritium gas confined, so that the nuclei are colliding continuously while maintaining an average energy of 20 keV. The scale of the problem becomes apparent when we think about the kind of vessel we need in order to keep the gas in a state of thermal agitation, as 20 keV per particle means a temperature of around 220 million degrees.
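The conversion behind that figure is the Boltzmann relation between energy and temperature; a minimal sketch, assuming the 20 keV is identified with the characteristic thermal energy k_B·T:

```python
# Illustrative conversion (not from the article): how "20 keV per particle"
# translates into a temperature, identifying the energy with k_B * T.

K_B = 1.381e-23   # Boltzmann constant, J/K
EV_TO_J = 1.602e-19

energy_kev = 20.0
temperature = energy_kev * 1e3 * EV_TO_J / K_B
print(f"{energy_kev} keV ~ {temperature:.2e} K")
# ~2.3e8 K, i.e. a couple of hundred million degrees, in line with the
# roughly 220 million quoted in the text.
```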

Interestingly, this serious problem is also the source of one of the big advantages of nuclear fusion. The need for such high temperatures makes it inherently safe because, as soon as the optimum operating conditions are degraded by the failure of any of the systems, the reactor will be unable to sustain the high temperatures required and the reaction will shut down automatically, leaving virtually no residual heat.

Historically, two types of "container" have been used to tackle the temperature problem. The first option is simply to heat the gas suddenly for a few nanoseconds so that the reaction takes place before it has time to expand freely. This principle underlies the "inertial confinement" method, the main exponent of which is the recently started NIF experiment in Livermore (USA). The second option exploits the fact that at such high temperatures the gas is in the plasma state, and so is composed of charged particles, which can therefore potentially be trapped by a magnetic field. This "magnetic confinement" approach is the one the European fusion programme has committed itself to, and it underlies the ITER experiment.

Magnetic confinement: the ITER project

The magnetic field cannot alter the charged particles' energy, but it can deflect their paths, so that they travel along a helix around the field line.

Nuclear fusion is inherently safe because as soon as the optimum operating conditions are degraded by the failure of any of the systems, the reactor will be unable to sustain the high temperatures required so the reaction will automatically shut down leaving virtually no residual heat

// Figure 1. Schematic representation of the D-T fusion reaction: deuterium and tritium fuse to yield helium, a neutron and energy //

Source: Figure courtesy of the author.


In the case of particles with energies of 10 keV and fields of the order of several tesla, the radius of these helices is a few centimetres for ions and a few millimetres for electrons. If we make the field line close up on itself in a "toroidal" geometry, we can keep the particles trapped.
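Those radii follow from the standard gyroradius formula r = mv/(qB). The sketch below is an added order-of-magnitude illustration for a deuterium ion only; the particular field values are assumptions.

```python
# Illustrative sketch (not from the article): the gyroradius r = m*v / (q*B)
# of a 10 keV deuteron, showing the centimetre scale quoted in the text.
import math

EV_TO_J = 1.602e-19
M_DEUTERON = 3.344e-27  # kg
Q = 1.602e-19           # C (single elementary charge)

energy_j = 10e3 * EV_TO_J
v = math.sqrt(2 * energy_j / M_DEUTERON)  # non-relativistic speed

for b_field in (1.0, 3.0):  # fields "of the order of several tesla"
    r = M_DEUTERON * v / (Q * b_field)
    print(f"B = {b_field} T: gyroradius ~ {r * 100:.1f} cm")
# ~2 cm at 1 T, ~0.7 cm at 3 T: centimetre-scale helices for the ions.
```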

The "tokamak" (from the Russian toroidalnaya kamera i magnitnaya katushka, meaning toroidal chamber and magnetic coil) is a magnetic trap that performs exactly this task. However, while the magnetic field slows the escape of energy from the tokamak, it does not stop it completely, because the collisions between the particles inevitably produce diffusion. It is therefore necessary to resort to a second parameter to maintain energy: size. The bigger the tokamak, the more efficient it will be. Currently, the world's biggest tokamak is the European Union's JET project in Oxford. It has managed to produce fusion power of up to 16 MW and can contain 80 m³ of plasma. It is 12 m tall overall and has a similar diameter. The next step in the development of fusion energy will be the big ITER experiment, which will have 1,000 m³ of plasma and, according to the laws of scale derived from the accumulation of previous experiments, should achieve energy gains of between 5 and 10 times, demonstrating what we might call the "scientific feasibility" of fusion as an energy source.

ITER is a large tokamak that is being built by a group of countries that are home to over half the world's population: China, Korea, the USA, India, Japan, Russia, and Europe (which is participating as a single partner). The experiment is located in Cadarache, in the south of France. It is a large-scale machine based on superconducting coils. To give an idea of its complexity, in the space of less than two metres there will be a transition from temperatures of a hundred million degrees in the plasma to almost absolute zero in the superconducting coils, which operate at 1.4 K. Another example is the complexity of the computer-aided design (CAD), which has to handle 10 million different parts, 10 times more than used in the Airbus A380 (Figure 2).

Construction of this huge project began in 2008 and experiments are expected to begin in 2019.

The technological challenges

At the start of this article I mentioned that the tritium needed for the D-T reaction can be produced inside the reactor from lithium. This will be done by a system wrapped around the plasma called the breeding blanket, which will also be responsible for extracting the energy from the neutrons and preventing them from reaching the superconducting coils. The reactor is brought into operation with an initial small amount of tritium supplied from external sources, and as the fusion reaction develops, each fused tritium atom emits a neutron. The neutrons collide with the material in the blanket, producing several additional lower-energy neutrons. These secondary neutrons are the ones which collide with the lithium to generate more tritium. This intermediate neutron multiplication is essential to maintain a sufficient rate of tritium renewal, i.e. slightly above 100%.

ITER will not have a complete breeding blanket, but a programme of tests of various designs will be run using small sections of breeding blanket (2x1 m²) attached to the plasma access doors.

The other major challenge for fusion, once the problem of plasma confinement has been solved, arises from the materials.

The first problem with the materials is caused by the charged particles escaping from the plasma and striking the reactor walls.

ITER is a large tokamak that is being built by a group of countries that are home to over half the world’s population: China, Korea, the USA, India, Japan, Russia, and Europe (which is participating as a single partner)


As explained above, the magnetic field slows the particles down but does not prevent them escaping: the outer wall of the reactor ends up receiving an intense flow of charged particles with energies typically of 50 eV. The problem with this flow of particles is that deposition is normally concentrated in specific areas of the machine, leading to local thermal loads of 20 MW/m², which is sufficient to vaporise almost any known material. This will be a lesser problem for ITER, in which the components affected are due to be replaced regularly, but it may be a serious limitation for a commercial reactor. The materials being considered for this purpose are carbon fibre and tungsten compounds, in both cases cooled by helium, and, as a long-term solution, walls based on liquid metal, basically lithium.

Meanwhile, the neutrons generated in the D-T reaction, which carry 80% of the energy, do not represent a thermal load problem, as they are distributed isotropically and absorbed uniformly by the whole bulk of the breeding blanket. However, they cause damage of a different kind to the structural materials surrounding the reactor: the neutron impacts cause interstitial bubbles of helium and hydrogen to form and displace the atoms in the crystal lattice of the material. These effects make the material fragile and degrade its structural properties. ITER will be almost free from this problem, as the rate of conversion of neutrons will be small, but in a commercial reactor each atom of structural material will suffer an average of 20 to 30 displacements per atom (dpa) per year of operation.

Additionally, the collisions between these high-energy neutrons and the material in the reactor produce radioactive isotopes that were not originally present. This means that at the end of the reactor's life, the materials will be classed as radioactive waste which, while not highly active, can nevertheless not be recycled or disposed of in conventional landfills. Studies are currently underway –with encouraging results– on so-called "low activation materials", which include certain types of steel and silicon carbide composite materials. These materials still generate wastes, but they remain radioactive for less than 100 years, after which time they could be reused, meaning that fusion would not leave behind environmental liabilities for future generations.

The IFMIF and DEMO projects

ITER will be a big step forward, but inevitably we need to ask about the steps that need to be taken after ITER, or in parallel with it, to arrive at commercial fusion reactors. Apart from the related parallel programme in physics and technology already mentioned, construction of another big experiment is being talked about: IFMIF (International Fusion Materials Irradiation Facility). This will be a neutron source producing the high-energy neutrons (14 MeV) necessary to perform low activation materials tests.

Around 2030, drawing on the results of ITER, IFMIF, and the physics and technology programmes, the construction of the next big fusion experiment will be addressed. This is currently referred to by the generic name of "DEMO" (for demonstration reactor).

/// Figure 2. General view of the ITER experiment ///////////////////

Source: www.iter.org


The main differences from ITER will be that the design will minimise downtime for maintenance so that it can be in continuous operation, 24 hours a day, 7 days a week; low activation materials will be used; and it will aim for self-sufficiency in tritium (using a breeding blanket). Finally, DEMO, which should come into operation around 2040, would have an energy extraction system feeding a grid-connected power plant.

Given DEMO's proximity to the commercial phase, it cannot be ruled out that, even if the cost exceeds that of ITER, there might be several DEMOs in parallel, promoted by individual countries or groups. We should not lose sight of the fact that the energy market is worth several trillion (10¹²) euros a year, hundreds of times the overall cost of the ITER project.

Spain's role in the fusion programme

Through its national energy, environment and technology centre, CIEMAT, Spain has been taking part in the European fusion programme since 1980, in two ways: confinement systems, with the tokamak TJ-I (1983-93) and the "stellarator" TJ-IU (1993-97), on the one hand, and low irradiation materials, on the other. In 1998 the TJ-II device –Spain's big commitment in the fusion field– came into operation. The TJ-II is a "stellarator", a toroidal-geometry confinement system, but with a design philosophy that differs from that of the tokamak. Despite being more complex and more expensive to build than a tokamak of similar size, a stellarator offers the possibility of steady-state operation (the tokamak is an intrinsically pulsed system) and greater stability, which makes this configuration an ideal candidate for use in future commercial reactors. There are stellarators operating in several countries: Spain (1), Japan (2), Russia (1), the US (1), Australia (1) and Germany (1). Germany is currently building the W7-X superconducting device, which is due to come into operation in 2014 and will be the biggest in the world.

With the start of construction of ITER, the Spanish programme has been adapting to enable it to participate and play a bigger role in fusion technology development. As a result, through international consortia, Spain is involved in many of the subsystems of the ITER project (instrumentation and measurement, breeding blanket modules, remote maintenance, control systems, etc.) and in the IFMIF project.

Other important components of the Spanish programme include the launch of the ICTS "Technofusion" project, focused on the development of materials, liquid metal applications and remote maintenance systems, and the Consolider "fusion technology" project, aimed at developing breeding blanket technology.

One important outcome of the existence of a solid Spanish fusion programme was the ability to compete at European level for the site of the ITER project. The quality of the bid, with Vandellós as a possible site, made it possible to pass all the technical examinations successfully. In the end, the political agreement reached in late 2003 gave the site to Cadarache (France), but Spain was selected to host the technical office for contract awarding and monitoring, located in Barcelona, employing 300 people and with a budget of almost €6 billion during ITER's construction.

And finally, the excellent role of Spanish industry in the construction of ITER should also be mentioned. Spain is second, after France, in terms of the number of bids submitted in calls for tender, and third, after Italy and France, in terms of total budget allocation. These figures are even more significant if we bear in mind that one of the contracts held by a Spanish company is for the construction of the large toroidal-field superconducting coils, the true "core" technology of the project, which was awarded to a Spanish-Italian consortium led by Iberdrola Ingeniería.

The Spanish TJ-II stellarator, located at CIEMAT. Source: CIEMAT


The Higgs boson

CERN's Large Hadron Collider (LHC) is, among other things, a vacuum-shaking machine. Its main goal is to find the Higgs boson, or prove that it does not exist. Generally speaking, it is impossible to prove that something does not exist. However, the success of Michelson and Morley's experiment was its failure. Likewise, the greatest success of the LHC would be to prove that the Higgs boson does not exist. This would put us in a prerevolutionary mindset, perhaps analogous to that at the dawn of the 20th century.

Álvaro de Rújula

CSIC, UAM, CERN and Boston University

The standard model

It was 124 years ago that Michelson and Morley published their landmark experiment. Their aim was to measure the movement of the Earth in relation to the Ether, the Newtonian interpretation of the vacuum as the framework of absolute space. Over a century later, we are still trying to understand the vacuum, using equipment such as the complex network of accelerators at CERN (Figure 1) and several of its detectors, such as CMS and ATLAS (Photo 1), which are no mere desktop experiments. Michelson and Morley's failure would end up supporting the Einsteinian vision: the vacuum is not a stage upon which things exist or move. On the contrary, it is these things themselves that configure the space-time in which they exist.

Predictive "relativistic quantum mechanics" –the so-called theories of "renormalisable" fields– have the particularity that, in them –as under a dictatorship– what is not prohibited is obligatory. For the theory to prohibit a process that has not been observed, the process must be incompatible with one of the theory's symmetries. Otherwise the "quantum" corrections (in successive powers of the Planck constant) would generate it and, what is worse, with an amplitude that is unpredictable.

The standard model is the theory on which our understanding of elementary particles and their interactions is based. Ab initio, one of its symmetries, mysteriously referred to as "gauge" symmetry, prohibits particles from having non-zero mass. Yet non-zero mass characterises the great majority of particles: only photons and "gluons" are massless. The symmetry cannot be broken at will: the theory then loses its predictive power, acquiring an infinite number of arbitrary parameters. In physicists' inscrutable liturgical language, the theory ceases to be "renormalisable".

To escape this impasse, the breaking of the symmetry has to be "spontaneous". For instance, imagine a pellet at the bottom of an upright test tube. Its position is both symmetrical (it is sitting on the axis) and stable (it is at the bottom of the test tube). Now let's suppose we heat the test tube and deform its base until it is shaped like the bottom of a bottle. Let's also imagine we are able to complete this operation with sufficient skill


that the ball remains on the central axis. Its position is now symmetrical but unstable. To be stable it needs to drop in one or other direction; this breaks the original symmetry: one direction has to be preferred, although it could have been any other. This is like a "Heisenberg" electromagnet, in which the spins of the atoms interact with one another and, below a critical temperature, align spontaneously. The theory has a symmetry (all directions are equivalent) although its least-energy solutions (the cold magnet) point in a spontaneously generated direction.

The Higgs mechanism

The gravitational energy, or the energy of the interaction between the spins referred to in the preceding paragraph, are described by interaction "potentials". The standard model includes the potential of a hypothetical scalar field (with zero spin): the Higgs field. In the minimum energy state (the vacuum) the field breaks gauge symmetry and one of its components, which is neutral, takes on a non-zero constant value (its value in the vacuum). The "broken" theory includes a vacuum which is not empty, but permeated by a "substance": namely a constant field. Unlike the Ether, however, this field is invariant in relativistic terms (it is the same for one observer as for others moving relative to him) and even invariant in general-relativistic terms (the value of the field in the vacuum is not diluted, even though the universe is expanding).
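For readers who want the formula behind the bottle-bottom picture, the textbook form of such a potential (added here as an illustration, not taken from the article) is the familiar "Mexican hat":

```latex
% Scalar potential for a complex field \phi, with \mu^2 < 0 so that the
% symmetric point \phi = 0 is unstable (the pellet on the axis):
V(\phi) = \mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2,
\qquad \lambda > 0 .
% Its minima lie on a circle of non-zero field values,
|\phi| = \sqrt{-\mu^2/(2\lambda)} \equiv \frac{v}{\sqrt{2}},
% so the vacuum must "choose" one direction, spontaneously breaking the
% symmetry, exactly as the pellet drops to one side of the bottle's base.
```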

Once the scalar field is added, everything in the preceding paragraph "happens" to the standard model, but also –inevitably– other things happen too. Particles with a spin of ½ (such as electrons and quarks) acquire mass, as their interaction with the vacuum –which is no longer empty– now has the form (unique and inevitable) of a mass "term". Something even more surprising happens to the "intermediate bosons" (W⁺, W⁻ and Z⁰), particles with a spin of one which mediate the weak interactions and are responsible, for example, for natural radioactivity. Initially, they have zero mass and, like the photon, two polarisation states.

The standard model is the theory on which our understanding of elementary particles and their interactions is based


/// Figure 1. The "complex" CERN accelerator complex, in the best Swiss watch-making tradition ///

Protons (or heavy ions) are initially accelerated in the LINAC 2 and Booster (LINAC 3) accelerators. They are transferred to the PS (a 52-year-old veteran). From there they proceed to the SPS, always accelerating around each ring. Finally, they are transferred to the two 27-kilometre rings of the LHC, where they are accelerated further and collided.

Source: CERN.


Once gauge symmetry has been broken, they acquire mass, which requires a third state of polarisation or "degree of freedom." They inherit this degree of freedom from the three other components of the Higgs field, which initially has four –two electrically charged and two neutral– only one of which survives in the field which "fills" the vacuum.

Admittedly, all this sounds a bit like a fairy story, but it isn't: the standard model describes our observations with impressive accuracy. Its most thoroughly tested part is "quantum electrodynamics", which describes how photons and charged particles interact (electric charge is the capacity to emit or absorb photons). It is possible to predict, for example, the gyromagnetic ratio of the electron, i.e. the intensity with which it behaves as a point magnet. Observation and theory are both increasingly precise and today match up to no less than 14 significant figures. Strong interactions (between quarks) and weak interactions have not been measured to such a high degree of accuracy, but sufficiently accurately to suggest that we are on the right track. The M(W)/M(Z) mass ratio, where M is mass and W and Z are the intermediate bosons, is predicted by the Higgs mechanism I have just outlined.
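For reference, the tree-level relation alluded to here is the textbook one (added as an illustration, not part of the article):

```latex
% Tree-level prediction of the Higgs mechanism in the standard model:
% the W and Z masses are tied together by the weak mixing angle \theta_W,
\frac{M_W}{M_Z} = \cos\theta_W,
\qquad\text{equivalently}\qquad
\rho \equiv \frac{M_W^2}{M_Z^2\cos^2\theta_W} = 1,
% a parameter-free consequence of breaking the symmetry with a single
% scalar doublet; measurements agree after small quantum corrections.
```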

The Higgs boson

The edifice of the standard model is lacking one crucial observation: that of the "Higgs boson." To recap, a boson is a particle with integer spin, like the photon, which has a spin of one. A fermion (such as the electron) is a particle with half-integer spin (½ in the electron's case). Fermions are asocial: there can be only one of them in a given quantum state. Bosons are the opposite: like football fans, the bigger the number of them in the same state, the less energy it takes to add more.

When any substance is shaken, it vibrates. The vibrations are fields; electromagnetic vibrations, for example, are light. At the elementary level, the vibrations are quanta (or particles): photons in this case. If there is a substance –the Higgs field– which permeates the vacuum, we could also "shake" it with sufficient energy to create its corresponding quanta (the Higgs boson), whose mass we do not yet know.

To be fair, the Higgs boson should be called the Peter Higgs, François Englert and Robert Brout boson, and perhaps with a few more names added besides. Particle physicists have missed the chance to give it a better name; some of the attempts, which I shall not mention here, are pathetic. The tradition is to give particles absurd names like "quarks" and theories bombastic but misleading names like "quantum chromodynamics." Personally, I think we should have gone back to an older tradition and called the Higgs boson the "kenonon," from the Greek word "kenon" for the vacuum. Instead, unpoetically, we usually call it the "Higgs," which is a bit like calling it the "thingamabob."

The LHC

CERN's Large Hadron Collider (LHC) does just what it says: it collides hadrons, which are particles composed of quarks, or a mixture of quarks and antiquarks, whose "chromodynamic" interactions are strong interactions mediated by gluons. Protons and neutrons and atomic nuclei are hadrons: P = (uud), N = (udd), where u and d are up and down quarks (as usual, "up" and "down" do not have their normal sense here).

The LHC has been built to collide a variety of hadrons ranging from protons up to lead nuclei. At the moment the energy of the protons is 3.5 tera-electronvolts (TeV), around 3,740 times their energy at rest, mc².

Alvaro de Rújula

Has a Ph.D. in Theoretical Physics from the Madrid Complutense University, where he later taught. He has also taught at the Institut des Hautes Études Sci-entifiques (IHES) in Paris and at Harvard University. Since 1977 he has been part of the team at the European Organisation for Nuclear Research (CERN) and has taught at Boston University since 1985. He works with the physics Nobel prize-winner Sheldon Glashow.

Elected a member of the Academia Europaea in 1991, he has been a member of the Institute of Theoretical Physics UAM/CSIC and the Spanish national energy, environment and technology centre (CIEMAT) since 2009.

As a theoretical physicist de Rújula has worked on several key issues in this discipline, including aspects of the internal structure of the atom, cosmology and astrophysics. He is one of the lead investigators of the CERN team launching the Large Hadron Collider (LHC).

Álvaro de Rújula.


collider (also near Geneva, but Geneva, Illinois) by a factor of 7. The performance of a collider, in terms of the number of collisions it produces, is characterised by its luminosity. The LHC is progressing rapidly towards its design luminosity, but at the moment is only operating at half its maximum energy.
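As a quick arithmetic check of the “3,730 times” figure, a minimal sketch; the proton rest energy of about 0.938 GeV is a standard value, not quoted in the article:

```python
# Lorentz factor of a 3.5 TeV proton: E = gamma * m c^2, with c = 1.
E_beam_GeV = 3500.0          # beam energy per proton
m_p_GeV = 0.938              # proton rest energy (standard value)
gamma = E_beam_GeV / m_p_GeV
print(round(gamma))          # ~3731, i.e. "around 3,730 times" mc^2
```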

The machine consists of two rings which guide and accelerate particles in opposite directions and collide them at four points (see Figure 1). Protons reach the LHC partially accelerated along a tortuous series of “old” accelerators. At the LHC’s collision points there are a couple of small experiments (TOTEM and LHCf) and four medium-sized ones: ATLAS (Photo 1), CMS, LHCb and ALICE. I won’t make the now inevitable mistake in the case of the LHC of calling them “big”. In reality we are always planning something bigger.

The hunting of the Higgs

Most particles are unstable and, at first sight, useless. To study them, first you have to create them. This requires collisions with an energy (at the “centre of mass”) higher than the energy at rest of the particle in question. For particle physicists this is easy to say, as we do not distinguish between energy and mass. For example, in the formula:

E = mc² / √(1 − v²/c²), we simply choose the units so as to set the speed of light c = 1. If we measure time in nanoseconds and distances in units such as “my left foot,” c = 1 because that foot (around 30 cm) precisely measures (by definition) the distance light in a vacuum travels in a nanosecond. By the way, the famous equation E = mc² is actually wrong. That is to say, it is valid only for an object at rest, in which case it is virtually a tautology.
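A one-line check of the “left foot” unit; the SI value of c is standard, the 30 cm foot is the author’s:

```python
# Distance light travels in one nanosecond, in metres.
c_m_per_s = 299_792_458        # defined SI value of the speed of light
print(c_m_per_s * 1e-9)        # ~0.2998 m, i.e. about 30 cm per nanosecond
```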

Among other things the LHC is a VSM (vacuum-shaking machine), as its main task is to find the Higgs boson or prove that it does not exist. Generally speaking, it is impossible to prove that something does not exist. This case is an exception. With a Higgs of mass greater than 1 TeV, the standard model would violate “unitarity”, which is to say that it would predict processes with probabilities of over 100%, which is an absurdity. The LHC is sufficient to manufacture objects with a mass of 1 TeV, so if it does not find the Higgs after a few years of looking, it means it does not exist. Or that it has properties significantly different from those predicted, in which case it would not be “the” Higgs boson.

The most likely process for the production of the Higgs boson is that shown in Figure 2. Interestingly, it would involve a “virtual triangle” of “top” quarks. The reason is that this is the most massive of the quarks, and the Higgs boson attaches to particles in proportion to their mass: this is how the Higgs field in a vacuum generates mass.

The Higgs boson decay process that best allows us to measure its properties –and find out if it has the identity ascribed to it– is also shown in Figure 2. The reason is that electrons and their unstable heavier “replicas”, the muons, are particles whose energy and direction can be measured with the greatest precision, enabling the spin and “coupling” of the Higgs boson to be determined well.

Coda

The “cosmological constant” (Λ), the only static entity that can exercise a gravitational repulsion –Einstein’s old idea, which he repudiated when he realised that the universe is not static– has come back (posthumously) with a vengeance. Einstein introduced it to stabilise the universe artificially against the gravitational attraction between galaxies, which would otherwise cause it to collapse in on itself. The universe is expanding and its expansion is accelerating. This implies that Λ is non-zero, contrary to what we believed until recently (without knowing why). All the data agree with the possibility that this Λ is responsible for the observed acceleration.

Photo 1. Partial view of ATLAS, one of the LHC’s detectors, at its underground collision point during assembly. ATLAS is 25 m high and 46 m long. It weighs 7,000 tonnes –almost as much as the Eiffel Tower. The detector includes 3,000 km of cables and has 100 million electronic channels and countless components using a variety of cutting-edge technologies. Source: CERN.



In current terminology, Λ is the density of energy in the vacuum. We understand the standard model well enough to be able to estimate how much the Higgs field contributes to the potential difference –the energy density between the false, unstable vacuum (the ball perched on the tip of the bottle bottom) and the true vacuum (the ball at the bottom). Gravity responds directly to the energy density (and momentum) and not, like other forces, only to differences in potential. The contribution of the Higgs field to the cosmological constant should be of the order of the potential difference between the false and true vacuum. But the result is an estimate some 54 (no less) orders of magnitude larger than cosmological observations. A contradiction on this scale is not without its merit.

Perhaps we can explain why particles have mass, but not why they have the masses they have. We have seen other flaws in our understanding of the vacuum and of a field that supposedly permeates it. We want to know more, and that is why we are furiously searching for the Higgs boson, or something that looks like it. Or doesn’t.

The success of Michelson and Morley’s experiment was its failure. Likewise, the greatest success of the LHC would be to prove that the Higgs boson does not exist. This would put us in a pre-revolutionary mindset, perhaps analogous to that at the dawn of the 20th century. I would rather not be the one who has to explain to the competent authorities (who, of course, always like to have the last word) that the biggest discovery would, in this case, be not to find what we are supposed to be looking for.

Appendix: a global undertaking

CERN has around 2,300 employees, 20 of whom are theoretical physicists on contracts lasting three years or more. There are also around 7,000 “users”, including students, scientists, engineers and technicians from around 70 countries, many of which, such as Switzerland and Spain, are multilingual. In CERN’s polyglot cafeterias a seasoned linguist could detect hundreds of the languages and dialects spoken in countries such as Italy or France, all coexisting peacefully. In total, around 330 Spaniards are working on the LHC’s experiments.

CERN’s budget, paid for by its twenty-plus member countries in proportion to their GDP (with a ceiling of 20% of the budget for Germany), is approximately a billion Swiss francs a year, of which Spain contributes about 7%. This means that each Spaniard contributes about a euro a year.

It is difficult to overestimate the long-term impact of basic research on the economy, technology or medicine, although all too frequently it is judged on a timescale closer to the electoral cycle. I will limit myself to the customary example, although it was no fluke: the Hypertext Transfer Protocol (the mysterious “http” of website addresses) was invented at CERN by Tim Berners-Lee. The protocol laid the foundations for the explosion of the Internet. Berners-Lee, with the initially sceptical approval and indispensable academic freedom given him by his superiors, set about finding a universal language with which groups of physicists could exchange data regardless of the type of computer they were using. And in a borderless world like that of CERN, he hit the mark.

[Figure 2 diagram: the two protons P emit gluons g, which fuse through a top-quark loop t into a Higgs boson H; the H decays into two Z bosons, which in turn decay into e⁺e⁻ and μ⁺μ⁻.]

/// Figure 2. Dominant process for production of a Higgs boson, H, from the collision of two protons, P. ///////////////////////////////

The fine coloured lines are quarks. The double spirals are gluons (g). t denotes a top quark loop. The two Zs are intermediate bosons, and the black lines are an electron, a positron, a muon and its antiparticle.

Source: CERN.


Matter and Energy Glossary

Antiparticle. Every subatomic particle in nature has a corresponding antiparticle, with the same mass and spin but the opposite electric charge. Some particles, such as the photon, are identical to their antiparticle. However, not all uncharged (electrically neutral) particles are identical to their antiparticles.

Antiquark. The antiparticle of a quark. Antimatter comprises the same numbers and types of antiquarks as ordinary matter does quarks. They are represented with the same symbol but with a bar over the letter: the antiquark of a quark u is written ū.

Boson. One of the basic types of elementary particle, with integer spin (the other group is the fermions, which have half-integer spin).

Brane. Neologism short for multidimensional membrane; a p-brane is a multidimensional membrane, or a sub-space of a larger space, with p dimensions. Thus, a 2-brane is an ordinary membrane (a two-dimensional surface).

Colour charge. A property of quarks and gluons that quantifies their capacity to emit or absorb gluons, the particles which mediate the strong interaction in quantum chromodynamics (QCD).

Coupling constant. Quantity that determines the strength of an interaction. An example is electric charge in the case of photons interacting with charged particles.

Dilaton. Hypothetical particle which appears in string theory.

Field. Basic entity describing elementary particles and their interactions. The gravitational field makes the apple fall from the tree and holds the Moon in its orbit. The electromagnetic field mediates the electromagnetic forces. Quantum fields have a dual behaviour: their vibrations are both particles (photons) and waves. A relativistic quantum field is one that satisfies the corresponding –well confirmed– theories.

Gauge field. Gauge fields are a generalisation of electromagnetism describing fundamental interactions. The colour field (quantum chromodynamics), whose associated particles are the eight gluons, describes strong interactions, whereas the electroweak field (whose associated particles are the W⁺, W⁻ and Z⁰ bosons and the photon) describes the electromagnetic and weak interactions.

Gluon. A boson which acts as the exchange particle in the “chromodynamic” or strong interaction, one of the fundamental forces. Gluons have no mass or electric charge, but do have a colour charge, so as well as transmitting the interaction they are affected by it.

Graviton. Hypothetical boson-type elementary particle that transmits the gravitational interaction in most quantum models of gravity.

Heisenberg uncertainty principle. The dual nature of the “fields” that describe the “wave” and “particle” aspects of objects such as electromagnetic radiation imposes restrictions on our ability to simultaneously determine their position and speed, or other pairs of “complementary” observables.

Higgs boson. An as-yet unobserved elementary particle needed to successfully complete the standard model of particle physics. Finding it is crucial to understanding how the other particles acquire mass.

Lepton. Fundamental fermion with no hadronic charge or colour. There are six leptons, each with a corresponding antiparticle: the electron, muon and tau, and the three corresponding neutrinos associated with them.

M theory. Theory which, starting out from string theory, aims to achieve a “theory of everything” able to unify the four fundamental forces of nature. First devised by Edward Witten, it combines the five superstring theories with supergravity in eleven dimensions.

Phonon. Quasiparticle or quantised mode of vibration that occurs in crystal lattices, such as the atomic lattice of a solid.

Photon. Elementary particle responsible for the quantum manifestations of electromagnetic phenomena. It is the particle that mediates the electric and magnetic forces, and which constitutes all forms of electromagnetic radiation, including gamma rays, X-rays, ultraviolet light, visible light, infrared light, microwaves and radio waves.

Planck constant. Relationship between the quantity of energy and the frequency associated with a particle. It plays a central role in quantum mechanics. Setting it to zero recovers classical mechanics.

Quantum chromodynamics (QCD). Quantum field theory that describes one of the fundamental forces, namely the strong interaction that is responsible for the forces between quarks.

Quantum correction. Some quantities, such as spin, are intrinsic to quantum mechanics and have no analogue in classical mechanics. But in other cases, when we calculate the value of an observable, there is a contribution from classical mechanics, to which we add a quantum correction proportional to the Planck constant.

Quarks. Fundamental particles with “strong” charge and (fractional) electric charge. Two of them (up and down) are the components of protons and neutrons.

Spin. A quantum effect which implies there is a residual angular momentum even in the reference system in which the particle is at rest. If the spin is an integer (0, 1, 2) the particle is called a boson, and if it is half-integer (1/2, 3/2), it is called a fermion. The Pauli exclusion principle, which states that no two identical fermions can occupy the same quantum level simultaneously, underlies the structure of Mendeleev’s periodic table.

Standard model of particle physics. Theory which describes the known relationships and interactions between the elementary particles that make up matter. However, the standard model is not a complete theory of all fundamental interactions, as it does not include a quantum theory of gravitation, the fourth known fundamental interaction.

String theory. Model of physics that assumes that particles are not a zero-dimensional “point” without internal structure but a minuscule string that vibrates in a space-time of more than four dimensions. This theory, expanded upon by others such as superstrings or M theory, aims to get away from the concept of a particle as a point.

Supersymmetry. Also known by the initials SUSY, supersymmetry is a hypothetical symmetry relating the properties of bosons and fermions. Although it has not been verified experimentally, it is a natural symmetry, constituting a fundamental part of many theoretical models, including superstring theory.

Symmetry. Equivalence of several possibilities. In any empty space, all directions are equivalent. A perfect cube has a lower degree of symmetry (only turning it through 90º leaves it unchanged). Some symmetries are “internal” (i.e. within mathematically defined “spaces”). For example, the up quark and the down quark are identical, if we ignore their different masses and electric charges. They “point” up or down in an “internal” space.



The current theory of elementary particles is known as the standard model, which is possibly a sign of physicists’ overconfidence. For reference, it is worth distinguishing between the particles that describe matter, such as quarks, and those that describe interactions, such as photons.

M theory (strings)

Enrique Álvarez

Universidad Autónoma de Madrid

What are things made of? This question is as old as the world. We all know that there are molecules, and that they are made up of atoms, which in turn contain electrons and a nucleus of protons and neutrons, which are in turn made up of quarks and gluons, etc. But does this list have an end or will new layers of the cosmic onion appear as our ability to perceive them improves?

Another important concept is that, due to the Heisenberg uncertainty principle, to explore ever smaller distances experimentally we need to give particles ever greater amounts of energy. This is done using big accelerators like the LHC (Large Hadron Collider) at the European particle physics laboratory (CERN) in Geneva, which is currently the world’s most powerful collider.
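As a rough sketch of that distance-energy trade-off; ħc ≈ 197 MeV·fm is a standard value and the function name is mine, not the article’s:

```python
# Distance scale probed at a given collision energy, Delta_x ~ hbar*c / E.
hbar_c_MeV_fm = 197.327                 # standard value of hbar*c

def probed_distance_fm(energy_MeV):
    """Smallest distance (in femtometres) resolvable at this energy."""
    return hbar_c_MeV_fm / energy_MeV

print(probed_distance_fm(1e6))          # 1 TeV -> ~2e-4 fm, i.e. ~2e-19 m
```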

We know of four (and only four) fundamental interactions: two –the electromagnetic and gravitational interactions– are long-range (and therefore manifest themselves on the macroscopic level), and two are short-range (of the order of the radius of a nucleus, 10⁻¹³ cm), namely the strong interaction (responsible for binding quarks together into elementary particles) and the weak interaction (responsible for beta radiation). All interactions are in principle reducible to these

The standard model accurately describes the strong, electromagnetic and weak interactions


four fundamental ones. In particular, chemistry (and consequently all of biology) can be reduced to the electromagnetic interaction, as can all condensed matter physics. At the level of elementary particles, gravitation is by far the weakest interaction, and its effects are not expected to become important until an energy per particle of the order of what is called the Planck energy is reached, which is around 10¹⁶ times bigger than anything that can be achieved in the LHC.
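An order-of-magnitude check, using standard values not quoted in the article (Planck energy ≈ 1.22 × 10¹⁹ GeV, LHC design collision energy 14 TeV):

```python
import math

E_planck_GeV = 1.22e19     # Planck energy, sqrt(hbar * c^5 / G)
E_lhc_GeV = 14e3           # LHC design collision energy, 14 TeV

# ~15, i.e. between 10^15 and 10^16, consistent with "around 10^16"
print(math.log10(E_planck_GeV / E_lhc_GeV))
```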

The standard model just mentioned accurately describes the strong, electromagnetic and weak interactions. All interactions are described using gauge theories, which are generalisations of electromagnetism. In the same way that the photon is associated with the electromagnetic field, all the fields that represent fundamental interactions have their own associated particle. It should be noted that the Higgs particle still remains to be found, and doing so is one of the LHC’s main challenges, as its existence is essential to explaining the masses of the intermediate W± and Z⁰ bosons of the weak interactions. These were discovered at CERN in 1983 by the physicists van der Meer and Rubbia, who were awarded the Nobel Prize in 1984.

However, neither the standard model nor any of the extensions made to it is able to incorporate gravitation. The reason it is difficult to incorporate gravitation is that when quantum effects are calculated the results obtained diverge in such a way that it is not possible to absorb the divergences in a finite number of parameters (relating these parameters is known as renormalisation and involves treating the charges and masses observed experimentally as the sum of a free contribution and the contribution from the interaction; both are divergent, but in such a way that their sum is finite and coincides with the observed data), which is what does happen with the masses and charges in the standard model, which is renormalisable in the sense just described. Moreover, there are signs that it is not possible to treat gravity in a way that disregards the other fundamental interactions of nature. The reason is that gravity is universally coupled

Enrique Álvarez

Has a PhD in physics (1975), obtained under the supervision of Professor Lluis Bel. He has been a visiting researcher at Paris, Princeton and Harvard, and has spent long periods at the European Particle Physics Laboratory (CERN) in Geneva.

His research focuses on the quantum theory of elementary particles, in particular quantum gravity and string theory.

He is currently working in the Department of Theoretical Physics at the Madrid Autonomous University and the UAM/CSIC Institute of Theoretical Physics on the UAM’s Cantoblanco campus.

For reference, it is worth distinguishing between the particles that describe matter, such as quarks, and those that describe interactions, such as photons

Enrique Álvarez.


(with the same intensity) to all forms of matter/energy. Thus, when studying resistance to the propagation of a graviton, for example, all forms of matter have exactly the same importance.

It was in this context that string theory was resurrected in the early 1980s. The basis of these theories lies in positing that the fundamental entities are not particles but extended objects called strings. I say it was resurrected because strings had actually been put forward before to explain strong interactions. However, this line was abandoned when a much simpler explanation was discovered, namely quantum chromodynamics (QCD), which earned Gross, Politzer and Wilczek the Nobel Prize in 2004. Moreover, strings had their own problems of internal consistency. In particular, there is no fundamental state of minimum energy for them (there is an excitation of imaginary mass in their spectrum such that the corresponding energy is negative and, in fact, has no lower bound). However, researchers realised that by changing the energy scale of the strong interactions to the (much larger) scale corresponding to the gravitational interactions, i.e. equivalent to the Planck mass (energy), and positing the existence of a symmetry between matter and interaction, called supersymmetry, this excitation of imaginary mass would disappear from the spectrum and the theory would incorporate Einstein’s gravitation in a natural way at its very low energy limit. For this to be consistent it was also necessary to postulate the presence of a number of extra dimensions, as the theory needed 10 dimensions instead of the usual four. Naturally, this is only possible if these dimensions close up on themselves to form circles of minute radius, such that they cannot be observed unless there is sufficient energy to allow this scale of distances to be resolved (recall again the relationship between distance and energy imposed by the uncertainty principle).

For the first time there was a model that might enable unification of gravitation and quantum mechanics in a way that was not obviously inconsistent. In the model there are open strings, with free ends, and closed strings, like loops. There are two parameters in the theory. The first is the tension of the string, usually expressed as a function of the string length scale ls, which indicates how different the string is from a set of quantum fields, the entities that describe the dynamics of elementary particles (in the limit where ls = 0 the string is reduced to a point, making it physically equivalent to a particle). The second parameter is the coupling constant, which is determined by the value of a field called the dilaton, and which indicates the probability that an open string turns into a closed one. The name dilaton comes from the fact that this field reflects the behaviour of the system under dilations. Our ability to make predictions based on the theory depends to a large extent on the coupling constant’s being sufficiently small for it to be possible to use a technique known as perturbation theory (which consists of starting from a known result and treating the interaction as a small modification of it). As noted, this depends on the intensity of the dilaton field.
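A toy illustration of why perturbation theory needs a small coupling; this is a generic geometric series of my own devising, nothing string-specific:

```python
# Partial sums of a toy perturbative series in a coupling g.
# For small g the first few terms already settle near the exact answer,
# 1/(1-g); for g >= 1 adding terms never converges to anything useful.
def partial_sums(g, n_terms=10):
    total, sums = 0.0, []
    for n in range(n_terms):
        total += g**n            # stand-in for the n-th order correction
        sums.append(total)
    return sums

print(partial_sums(0.1)[:4])     # [1.0, 1.1, 1.11, 1.111] -> near 1.111...
print(partial_sums(1.5)[:4])     # [1.0, 2.5, 4.75, 8.125] -> diverges
```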

But from this new point of view the particles are no longer elementary; instead they are quantised excitations of a string, that is to say, a series of discrete energy levels, something like the discrete levels of an atom or nucleus. After all, in quantum mechanics there is a wave/particle duality, and strings handle this naturally.

We know of four (and only four) fundamental interactions: two long-range ones (the electromagnetic and gravitational interactions) and two short-range ones, the strong interaction (responsible for binding together the quarks in elementary particles), and the weak interaction (responsible for beta radiation)


Excitations of open strings are gauge fields, which, as we saw at the start, are those that describe the fundamental interactions. The excitations of the closed strings include the graviton, the hypothetical quantum of gravity, which can also be considered a kind of gauge theory.

However, there is not just one string theory but five, apparently very different ones. For taxonomic purposes they are: Type I SO(32), which is the only one that has both open and closed strings, along with four others which only include closed strings, namely IIA, IIB, Heterotic E8 × E8 and Heterotic SO(32). This is a little embarrassing for a theory that aims to achieve a unified description of all interactions. However, researchers have begun to discover relationships between these five theories.

Typically these relationships, called dualities, relate a theory with weak coupling (i.e. where the coupling constant is small and perturbation theory is valid) to another theory with strong coupling (in which the coupling constant is large and perturbation theory cannot be applied). This same phenomenon makes it very difficult to verify explicitly that these dualities are correct. There are some obvious necessary conditions for the dualities to be valid, such as the fact that the spectrum of energy levels must be identical in the two theories. All the conditions that it has been possible to study to date are properly complied with. On the other hand, the need soon arose to introduce branes (a neologism abbreviating the expression multidimensional membranes), which are objects with two or more spatial dimensions (strings have just one spatial dimension); they are topological defects into which open strings’ ends are embedded. For example, an ordinary membrane, like the skin of a drum, is a 2-brane. Earlier I said that open strings have free ends. However, it turns out that these ends need to have a special brane called a D-brane as their end-point (where D stands for Dirichlet, a 19th-century German mathematician who studied boundary conditions of this type in differential equations). What is special about strings (which are 1-branes) with respect to other branes is that it is possible to study the energy spectrum of their quantum fluctuations using the perturbation theory mentioned above.

The fact that the coupling constant is determined by the value of the dilaton, which is itself variable, led to a reinterpretation of the dilaton as the radius of a certain extra dimension; in fact, it has been shown that a theory without a dilaton is possible if an extra dimension is included, such that the radius of this extra dimension is related to the dilaton (the bigger the radius, the more strongly coupled the corresponding string theory): this is the famous M theory in 11 dimensions. The dimensions can be small circles, in which case we call them compact. The three ordinary dimensions, on the other hand, are not compact, as they extend indefinitely. When all the dimensions are non-compact this theory does not have a coupling constant, which means it cannot easily be subjected to perturbative analysis. It is ironic how little we know about our candidate theory of everything.

In reality we only know what this theory is like in the limit of weak gravity, which according to the theory of general relativity is equivalent to small curvatures. In this limit it turns exactly into a modification of Einstein’s general relativity, namely supersymmetric gravity, specifically the one which incorporates all of supersymmetry without running into inconsistencies.

However, there is not just one string theory, but five, apparently very different ones


It also contains the graviton (which you will recall is the particle associated with the gravitational field), a field of matter called the gravitino, a brane in three spatial dimensions and another brane of five dimensions, both related by the duality symmetry I alluded to above. As mentioned earlier, certain limits in which M theory reduces to one of the five string theories mentioned are also explicitly known.

The situation somewhat resembles Plato’s myth of the cave: the only shadows accessible to us are precisely those corners in which the five different string theories can be treated in a perturbative way (a small dilaton equates to a small coupling constant). This happens when one of the 11 dimensions is a circle of minute radius which we identify with the dilaton and therefore with the coupling constant of the corresponding string theory. We are still a long way from understanding what lies behind this M theory in the general case.

The problems connecting with the low-energy world (which in this context means everything much smaller than the Planck energy scale) derive from the fact that the scale of compactification, i.e. the size of the additional dimensions, has to be small enough for them to be impossible to observe in today’s accelerators. The existence of this new scale of lengths is inevitable, as the low-energy world (in the sense just defined) is not supersymmetric (there is no equivalence between matter and interaction), nor does it have more than four visible ordinary (non-compact) dimensions. What happens at this scale is a mechanism known as symmetry breaking. At distances smaller than this scale, the eleven dimensions of M theory are visible and supersymmetry is exact; at distances greater than this scale, i.e. bigger than the scale of compactification, string theory does not show its “stringy” peculiarities, and it is not easy to observe its predictions, as it looks like a four-dimensional field theory with certain non-string-specific peculiarities.

To conclude: the more one explores the various aspects of M theory, the more mysterious it seems, and new, often unexpected, avenues open up before us.

It remains to be seen whether we are able to understand the underlying physical principles and make specific predictions that we can confirm experimentally. Time will tell.

Meanwhile, it is perhaps worth noting that, under certain conditions that we do not fully understand, M/string theory allows perturbative regimes in which the couplings are weak, and the calculations possible in these regimes take us back to previously known results (such as general relativity). This is extremely important, as it allows us to get our feet back on the ground. In my opinion, there is an unjustified tendency to consider this fact irrelevant. However, in all previous revolutions in physics there has always been a clear limit at which the new revolutionary extension could be reduced to previously existing theories, thereby clearly setting the bounds to its range of validity with reference to earlier theories (speeds much lower than that of light in a vacuum in the case of special relativity, or actions large in relation to the Planck constant in the case of quantum mechanics). For strings this role is played, as I have mentioned, by the dilaton and the ls length scale. When both go to zero, we return to ordinary quantum mechanics coupled to general relativity and the oft-mentioned gauge fields.

But neither the standard model nor any of its extensions are able to account for gravitation

03 … in the exploration of the universe


Enrique Gaztañaga

Instituto de Ciencias del Espacio (CSIC)

The darkness of the cosmos

Ordinary matter alone is unable to account for the formation of the galaxies and stars that we observe. Therefore, few cosmologists doubt that “cold dark matter”, or something similar, must exist. However, the complications do not end there: as well as dark matter, the evidence also suggests that there is a new source of energy, so-called dark energy.

It is estimated that most of the matter making up the universe is dark, or not very luminous, and that it largely consists of an exotic substance called “cold dark matter” (CDM), which is unlike anything we know from current physics. The term “cold” refers to the fact that it does not interact –or does so only weakly– with other known forms of matter, except by gravitational attraction, which is assumed to be universal.

What evidence do we have for this? Firstly, it has been calculated that the density of ordinary or baryonic matter, which is to say that made up of the known particles (i.e. those included in the standard model), accounts for just 4% of the critical energy density, which is that derived from the expansion of the cosmos, and which is equivalent to approximately 5 protons per cubic metre. Secondly, the movement of the galaxies (and the anisotropies in the cosmic background radiation) tells us that the total density of matter is about six times greater than that of baryonic matter, or around 25% of the critical value. Hence, it has been concluded that most matter (21% compared with 4%) is not made up of ordinary matter and interacts weakly with it, i.e. it is cold dark matter (CDM). Moreover, this exotic matter is essential in order to understand the formation of the stars and galaxies that we observe: ordinary matter alone cannot account for it. Therefore, few cosmologists doubt that CDM, or something similar, must exist.
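A quick numerical check of the “approximately 5 protons per cubic metre” figure, assuming a Hubble constant of 70 km/s/Mpc (my assumption, not stated in the article):

```python
import math

# Critical density of the universe: rho_crit = 3 H^2 / (8 pi G).
G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
m_proton = 1.673e-27             # proton mass, kg
H0 = 70e3 / 3.086e22             # 70 km/s/Mpc converted to 1/s

rho_crit = 3 * H0**2 / (8 * math.pi * G)   # kg per m^3
print(rho_crit / m_proton)       # ~5.5 proton masses per cubic metre
```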

However, the complications do not end there: as well as dark matter, the evidence also suggests that there is a new source of energy, called dark energy.

As is well known, in 1915 Albert Einstein proposed the theory of general relativity (GR), which incorporated gravity into his theory of special relativity and successfully replaced the theory of gravity put forward by Isaac Newton in his Principia (1687). According to GR, the presence of the Earth (or any other form of matter-energy) produces a curvature of space-time that alters the movement of bodies, giving the appearance that objects fall due to the force of gravity. A spectacular


confirmation of this new theory came with the so-called “gravitational lens” effect, whereby light (which does not have mass, but does have energy) is bent as it passes by a massive body. In 1919, Arthur Eddington observed this bending during a solar eclipse. For once, the headlines did not report just the news of the war or football, but claimed “light does not travel in a straight line,” “space is curved,” “Newton’s theory is wrong.”

If we apply GR to the cosmos as a whole, the movement of its constituent parts must be due to the presence of matter (or energy). In other words: studying the motion of the cosmos reveals its contents. The equations that govern this relationship are known as the Friedmann equations (1922). These derive directly from GR and reduce to Newton’s equations, just as they are still taught at school. These are the equations of conservation of energy and motion, which derive from applying simple symmetry criteria: the laws of physics are the same at all times and in all places in the cosmos. These equations relate the measurements of cosmic movement to measurements of its contents and curvature.
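For reference, the first Friedmann equation in its standard textbook form (not written out in the article), where a(t) is the scale factor, ρ the energy density, k the curvature and Λ the cosmological constant:

```latex
H^2 \equiv \left(\frac{\dot a}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}
```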

The big surprise was to see that, given the density of matter in the cosmos, the observed rate of cosmic expansion (Hubble’s law) was, approximately, that predicted by these equations. This encouraged cosmologists to make increasingly precise measurements. In the last decade, measurements of the curvature of the cosmos, made with maps of the cosmic radiation (WMAP), galaxy maps (SDSS) and measurements of distances to supernovae, suggest that our universe is flat (Euclidean geometry applies). That is to say, overall, the cosmos does not have any curvature, to a precision (and accuracy) of better than 1%. According to the GR equations this implies there must be an almost perfect cancellation between the energy density due to expansion and the

Enrique Gaztañaga

Is an astrophysicist with a PhD in physics from the University of Barcelona (1989). He did his postdoctoral training at the NASA/Fermilab Center for Theoretical Astrophysics (University of Chicago) and at the Astrophysics Department of the University of Oxford, United Kingdom. He has held an endowed chair at the National Institute of Astrophysics, Optics and Electronics in Mexico and has been a tenured lecturer in theoretical physics at the University of Barcelona.

He is currently a research professor at the Institute of Space Sciences (www.ice.csic.es) and the Institut d’Estudis Espacials de Catalunya (IEEC/CSIC) in Barcelona. He is on the steering committee and scientific committee of the DES (Dark Energy Survey, www.darkenergysurvey.org) and PAU (Physics of the Accelerating Universe, www.pausurvey.org) galaxy mapping surveys. His current research focuses on the study of the characterisation and origin of the large-scale structure and content of the universe.

density of the energy-matter content of the cosmos. In other words, the energy density of the cosmos must have exactly the critical value just mentioned. Measurements of dark matter indicate that there is only 25% of the critical value of energy in the form of matter. This is quite close to the critical value but does not reach it. The precision of the measurements leaves no room for doubt: 75% of what we need to balance the GR equations is missing. This 75% is so-called dark energy. The conclusion is robust and seems unavoidable: either Einstein’s theory of gravity (GR) is wrong at cosmic scales, or there must be a new form of energy.

But what differentiates dark energy from dark matter? We could say that the former accelerates the universe whereas the latter slows it down. This is fairly easy to understand. The density of dark matter is diluted because the cosmos is expanding, which implies that the expansion energy is diminishing, resulting in a deceleration, while the density of dark energy remains constant, which produces an acceleration.
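The standard acceleration equation (again a textbook form, not quoted in the article) makes the sign of the effect explicit: matter, with negligible pressure p, decelerates the expansion, while dark energy, with p = −ρc², accelerates it:

```latex
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right),
\qquad p = -\rho c^2 \;\Rightarrow\; \ddot a > 0
```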

The fact that dark matter and energy provide contributions that are of a similar order of

Enrique Gaztañaga.


magnitude (21% and 75%) to the rate of expansion is known as the “problem of cosmic coincidence.” It is a coincidence because in the past, due to cosmic expansion, the density of dark matter was much greater than the density of dark energy. Why do their values approximately coincide today? Does this tell us something about their nature? Could they both be related?

There is an obvious candidate for the title of “dark energy.” It is simply the density of energy in a vacuum, or the fundamental state of matter. Its value does not change the other interactions of matter (electromagnetic and nuclear forces), which only depend on the difference between various excitation states but not on their absolute value. The problem with this idea is that the measured density of dark energy is extremely low: just a few protons per cubic metre. Its value therefore does not match the typical energy densities in the standard particle model, which are at least one proton per nuclear volume unit (the characteristic radius of an atomic nucleus is a thousand-trillionth of a metre, that is to say a metre divided by 10¹⁵). There is a difference of 45 orders of magnitude (10⁴⁵) with respect to the value of dark energy. It could be argued, given that dark matter interacts very weakly, that its vacuum energy may be much lower. But in this case we would have to explain why only the vacuum energy of dark matter is important, and why we should ignore the contribution to vacuum energy from ordinary matter. And, what is worse, according to quantum physics, the vacuum is populated by energy fluctuations according to the so-called “Heisenberg uncertainty principle” (1927): the quantum energy density characteristic of gravity (the so-called Planck energy) is 10¹²⁰ times greater than dark energy. Does this mean there is a conflict between GR and quantum physics?
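A crude version of the 45-orders estimate, with illustrative numbers of my own choosing rather than the article’s:

```python
import math

# One proton per nuclear volume vs. a few protons per cubic metre.
nuclear_volume_m3 = (1e-15)**3               # ~ (nuclear radius)^3
nuclear_density = 1.0 / nuclear_volume_m3    # protons per m^3
dark_energy_density = 4.0                    # protons per m^3, order of magnitude

# ~44-45 orders of magnitude, consistent with the figure in the text
print(math.log10(nuclear_density / dark_energy_density))
```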

In GR, the vacuum energy can also be interpreted as a fundamental constant, called the cosmological constant. Its existence could simply be posited as the solution to the problem of dark energy. But the data do not yet allow us to be sure that it is a constant (although the margin for any possible variation is only 5%). Other popular candidates for the title of dark energy are so-called “quintessence” or the possibility of more spatial dimensions (beyond the three known). There are multiple variations on these ideas. Unlike the case of the cosmological constant, in these other models the value of the density of dark energy does not remain constant and varies slightly as the cosmos evolves. If we were able to measure a variation in the density of dark energy, we might be able to distinguish between the different models.

We are at a turning point: either we introduce these dark components, which has echoes of the obsolete theories of the ether introduced at the end of the 19th century to combine electromagnetism and Newtonian dynamics, or we have to change the fundamental laws of physics, and general relativity in particular. How should we proceed? No doubt, with more observations: drawing up bigger maps of the cosmos (which therefore go back further in time).

There is a simple way to tell whether we really need to modify GR or admit the existence of dark energy. The idea is to compare the history of the expansion of the universe with the history of how its structure has developed. As well as studying cosmic expansion, cosmology describes the growth of structures. Starting with a uniform distribution of radiation and elementary particles,

Astrophysics has two observational techniques: photometry, which consists of taking pictures of the sky, and spectroscopy, which disperses a beam of light to split it into its different colours or wavelengths


it tries to explain how the stars and galaxies formed, giving rise to our existence. Given a history of expansion based on observations, GR’s equations predict a single history for growth. How can we measure this growth?

Astrophysics has two observational techniques: photometry, which consists of taking pictures of the sky, and spectroscopy, which disperses a beam of light to split it into its different colours or wavelengths. The dispersing medium acts like a simple prism, which refracts the light of different colours at different angles, producing a rainbow or spectrum. The spectrum identifies the chemical composition, relative abundance and physical state of the matter/object emitting the light. We have collected spectra from more than a million galaxies and have photometry data (in various colours) from tens of millions of galaxies.

Why do we need so many spectra? We are trying to take a census of the universe, with the positions of galaxies within a radius of billions of light years (a light year being the distance light travels in a year). These enormous maps also trace the history of the cosmos, like a book in which the light collected after the longest delay comes from the most distant points. We have measured delays of billions of years for the most remote objects, from times when neither the Earth nor the Sun had yet been born. It is a time machine that literally lets us see how galaxies are born and age.

The distance of the galaxies in our map is inferred from an analysis of the spectrum of each galaxy. The result is a

The figure shows a slice through the equator (inclination ±1.25 degrees). The Earth is at the centre of the image and the radial direction in the figure is redshift z (cosmic distance-time). The galaxies are coloured according to the age of their stars (which we know from their spectrum): the redder the star, the older it is. The denser areas (clusters) are older and, if you look closely, they appear stretched in the direction of the observer due to the high characteristic speeds of the galaxies within them. Source: www.sdss.org

/// Figure 1. Distribution of the galaxies in the cosmos //////////////////////////////////////////////////////


three-dimensional (3D) distribution of the galaxies around us. An example of the resulting map is shown in Figure 1. The Milky Way is a tiny dot at the centre of this figure. The galaxies are distributed in great walls and filaments that intersect in enormous clusters of galaxies around relatively empty areas.

One thing that stands out in the map in Figure 1 is how the contrast (and therefore the average amplitude) of the structures tends to diminish as we move away from the centre. These maps show the evolution of growth over time (recall that distance is also time). Additionally, the correlations between the galaxies give us a way of calibrating distances in the cosmos, which translate into measurements of the acceleration, or of the history of its expansion. In this way, maps of galaxies can test GR: we can measure the rate of expansion and use it to predict the rate of growth. Does this prediction match the rate observed?
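As a minimal sketch of how a measured redshift becomes a map distance: for small z, Hubble’s law gives d ≈ cz/H₀ (the value H₀ = 70 km/s/Mpc is my assumption; the article does not specify it, and for the distant objects in these maps a full cosmological model is needed instead):

```python
# Toy redshift-to-distance conversion, valid only for z << 1.
c_km_s = 299792.458              # speed of light, km/s
H0 = 70.0                        # assumed Hubble constant, km/s/Mpc

def distance_Mpc(z):
    """Approximate comoving distance for a small redshift z."""
    return c_km_s * z / H0

print(distance_Mpc(0.1))         # ~430 Mpc, roughly 1.4 billion light years
```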

Today, the measurements are not sufficiently precise or detailed to be able to perform this type of test conclusively. We are therefore preparing new, more detailed and more accurate maps of the galaxies. These new maps aim to reach distances 5 to 10 times further than those in Figure 1 and are divided into two types: spectroscopic and photometric. The former (such as SDSS-III, GAMA and GiggleZ) are similar to the map shown in the figure, in 3D, but use more resources or select only certain objects in order to increase the distance. The latter (such as Pan-STARRS, DES and LSST) are angular maps (2D) where the redshift (or radial distance) is only known approximately (with an error of 10%) from various colour measurements. The loss of radial resolution is compensated for by the area, the depth or the large number of galaxies sampled.

In Spain, we are leading a new cartography exercise, called PAU (Physics of the Accelerating Universe, partly funded by the Ministry of Science and Innovation’s Consolider Ingenio 2010 programme), which uses a new mixed technology, part way between spectroscopy and photometry. The purpose is to draw up a photometric map using 40 different colour filters, so that for each image of the map we also have a low-resolution spectrum. To do this, in our laboratory we are now constructing the PAUCam camera, which will allow us to implement this new technology. The resulting map will have an optimal spatial resolution with which to study the dark universe. Our plan is to install PAUCam in late 2012 on the WHT telescope on La Palma (www.ing.iac.es). The new mapping surveys require more precision and better control over systematic errors and the details of the theoretical models. Therefore, we are also working on a new data processing system adapted to PAU and on more extensive and detailed simulations (see www.ice.cat/mice). The PAU mapping survey will give a unique combination of resolution, depth and sampling density. Altogether this will allow us to exploit new techniques, in addition to those already mentioned. PAU will be able to measure the rate of growth of dark matter directly (almost in 3D) using the gravitational lens effect. It will also have sufficient resolution to measure the peculiar motions of the galaxies statistically (see Figure 1). The combination of these measurements will make it possible to constrain the value of the cosmic acceleration and the rate of evolution of the growth of structures simultaneously and to a high degree of precision. The final precision will depend on the size of the PAU survey, but there is no doubt that, if successful, in a few years PAU may help resolve the mysteries of this dark cosmos.

In Spain, we are leading a new cartography exercise, called PAU (Physics of the Accelerating Universe, partly funded by the Ministry of Science and Innovation’s Consolider Ingenio 2010 programme) which uses a new mixed technology, part way between spectroscopy and photometry

At

”la Caixa” Welfare

Projects we collaborate with

non-profit organisations that, like

us, work to achieve a society with greater oppor-

tunities for everyone. This is why we favour projects that directly tackle

those problems that are emerging in our society and we support innovative initiatives

that are not covered by other subsidies. We believe that, in an egalitarian society, everyone must

have the right to having their basic needs met, such as housing, health and food… At ”la Caixa” Wel-

fare Projects we collaborate with non-profit organisations that, like us, work to achieve a society with greater

opportunities for everyone. This is why we favour projects that directly tackle those problems that are emerging

in our society and we support innovative initiatives that are not covered by other subsidies. We believe that, in an

egalitarian society, everyone must have the right to having their basic needs met, such as housing, health and food…

At ”la Caixa” Welfare Projects we collaborate with non-profit organisations that, like us, work to achieve a society with

greater opportunities for everyone. This is why we favour projects that directly tackle those problems that are emerging

in our society and we support innovative initiatives that are not covered by other subsidies. We believe that, in an ega-

litarian society, everyone must have the right to having their basic needs met, such as housing, health and food…

At ”la Caixa” Welfare Projects we collaborate with non-profit organisations that, like us, work to achieve a society

with greater opportunities for everyone. This is why we favour projects that directly tackle those problems

that are emerging in our society and we support innovative initiatives that are not covered by other

subsidies. We believe that, in an egalitarian society, everyone must have the right to having

their basic needs met, such as housing, health and food… At ”la Caixa” Welfare Projects

we collaborate with non-profit organisations that, like us, work to achieve

a society with greater opportunities for everyone. This is why we

favour projects that directly tackle those problems that

are emerging in our society and we

At

”la Caixa”

Welfare Pro-

jects we collabora-

te with non-profit

organisations that,


03.2 … IN THE EXPLORATION OF THE UNIVERSE

The exoplanetary bestiary

David Barrado-Navascués

Centro Astronómico Hispano-Alemán (MPG-CSIC) and Centro de Astrobiología (INTA-CSIC)

Anthropocentrism, the notion that man is at the centre of the universe, has always been present in science and philosophy. Time and again, reality puts us back where we belong, on a minuscule planet somewhere in an ordinary galaxy.

After centuries of speculating about whether other planets exist outside the solar system, and whether the Earth and our Solar System are special, the 1990s brought a genuine revolution in the search for exoplanets: first their discovery and, more recently, their characterisation (determining the planets' general properties and, in some cases, even features of their atmospheres). The starting pistol was fired by a Swiss group led by Michel Mayor in 1995, using a technique based on changes in the radial velocity of the central star: minuscule variations induced by the presence of a planet with a much smaller mass than the star it orbits. Since then more than 500 exoplanets have been detected, including about 100 planetary systems (i.e. stars with two or more planets).

Several factors have influenced this surge in discoveries, ranging from the variety of techniques developed to the sophistication of the theoretical models applied and the use of instruments at the cutting edge of technology.

In terms of the techniques used, as well as radial velocity (based on the Doppler effect, which is the change in frequency of a wave, such as light or sound, when the transmitter and receiver are moving relative to one another), planetary transits (analogous to eclipses of the Sun) are beginning to dominate the scene, given that we can now derive generic properties of star-planet systems (radii and masses, and, if combined with radial-velocity studies, densities and surface temperatures), and information can even be obtained about the chemical composition of a planet's atmosphere, although all without too much detail. In a few cases we have even been able to obtain images of exoplanets themselves, despite their being extremely faint in comparison with their central star.
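The scale of the two observables behind these techniques can be illustrated with a short back-of-the-envelope calculation. The sketch below is not from the article; it simply applies the standard textbook formulas for the radial-velocity semi-amplitude and the transit depth, with illustrative values, to show why massive, close-in planets were found first:

import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_JUP = 1.898e27     # Jupiter mass, kg
M_EARTH = 5.97e24    # Earth mass, kg
R_SUN = 6.957e8      # solar radius, m
R_JUP = 6.991e7      # Jupiter radius, m
R_EARTH = 6.371e6    # Earth radius, m
YEAR = 3.156e7       # one year, s

def rv_semi_amplitude(m_planet, period, m_star=M_SUN):
    """Radial-velocity semi-amplitude K (m/s) for a circular, edge-on orbit:
    K = (2*pi*G/P)**(1/3) * m_p / (M_star + m_p)**(2/3)."""
    return ((2 * math.pi * G / period) ** (1.0 / 3.0)
            * m_planet / (m_star + m_planet) ** (2.0 / 3.0))

def transit_depth(r_planet, r_star=R_SUN):
    """Fractional dimming during a central transit: (R_p / R_star)**2."""
    return (r_planet / r_star) ** 2

# A Jupiter analogue (11.86-year orbit) versus an Earth analogue (1-year orbit)
print(rv_semi_amplitude(M_JUP, 11.86 * YEAR))   # ~12.5 m/s: detectable in 1995
print(rv_semi_amplitude(M_EARTH, 1.0 * YEAR))   # ~0.09 m/s: still very hard
print(transit_depth(R_JUP))    # ~0.01: a 1% dip in the star's light
print(transit_depth(R_EARTH))  # ~8e-5: a 0.008% dip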


Saturn as observed by the Cassini-Huygens mission (NASA/ESA/ASI) with the Sun eclipsed by the planet’s surface. Although there is a certain amount of material around this gas giant, in the form of numerous satellites and rings, Saturn dominates the group. Source: NASA.

WHAT IS A PLANET?

At its general assembly in Prague in August 2006, the International Astronomical Union (IAU) came up with a definition of the term planet, at least insofar as the solar system is concerned. According to this, a planet is a celestial body that: (a) is in orbit around the Sun; (b) has sufficient mass for its own gravity to overcome its rigidity and give it an approximately round shape (due to hydrostatic equilibrium); and (c) clearly dominates its neighbourhood, having cleared its orbit of other objects.

On this definition Pluto is no longer a planet. Instead it has become the prototype of a new class of object, called the dwarf planets. The solar system is therefore left with eight planets: Mercury, Venus, Earth, Mars (the so-called rocky or terrestrial planets) and Jupiter, Saturn, Uranus and Neptune (the gas giants).

Pluto is no longer a planet. It has become the prototype of a new class of object, called dwarf planets


DAVID BARRADO-NAVASCUÉS

Has a Ph.D. in physics from the Madrid Complutense University, and continued his research at the Harvard-Smithsonian Center for Astrophysics in Cambridge (USA). He has worked as a postdoctoral researcher at various institutions in the US, Germany and Spain. He has also worked at the European Space Astronomy Centre (ESAC) as a member of the Spanish national institute of aerospace technology (INTA). On the INTA team he was the lead researcher for MIRI, the mid-infrared instrument that will fly onboard the future JWST space telescope. Subsequently, he joined the Astrobiology Centre (CAB), a joint institute established by the Spanish aerospace agency (INTA) and the Spanish National Research Council (CSIC). Since early 2010 he has been the director of the Spanish-German Astronomical Centre, Calar Alto Observatory.

His research interests focus on finding and characterising sub-stellar objects, and the properties of stars in open clusters. He has published over 100 articles in leading scientific journals.

TYPES OF PLANETS

Gas giants
The gas giants are primarily composed of gases, in particular hydrogen and helium. In our Solar System this category includes Jupiter, Saturn, Uranus and Neptune, although in the latter ice is a significant component. The gas giants, depending on how they formed, do not necessarily have a solid rock core, but may consist of a continuum of gas that becomes gradually denser and acquires fluid properties as the pressure increases towards the centre. In the case of Jupiter and Saturn, gaseous hydrogen in the molecular state gives way to a state known as "metallic hydrogen," which has some unusual properties. The vast majority of the extrasolar planets discovered are gas giants, at least partly because current detection methods are better at picking out more massive planets.

Rocky planets
Rocky planets, also known as terrestrial or telluric planets, are mostly composed of silicates and have atmospheres influenced by geological activity and, in the case of the Earth, biological activity. There are four rocky planets in the solar system: Mercury, Venus, the Earth and Mars. Interestingly, the first planets discovered outside our system were rocky, but could only be detected because they were orbiting a pulsar, an unusual type of star. Only since 1995 has it been possible to refine the detection methods for extrasolar planets sufficiently to be able to find other rocky planets. The search for and characterisation of planets similar to ours has become the focus of a number of space exploration missions.

David Barrado-Navascués.

In addition to our own planetary system, called the Solar System, in the last 15 years over 500 other planets –exoplanets– have been found orbiting other stars



What is a planetary system?
A planetary system consists of a star (or binary star system) and all the planets or minor bodies orbiting around it. In addition to our own planetary system, which we call the Solar System, in the last 15 years hundreds of planets (exoplanets) have been discovered orbiting around other stars. This has been made possible by various observational techniques, including high-resolution spectroscopy (radial velocity based on the Doppler effect) and high-precision photometry (e.g. to detect planetary transits). Some of these exoplanets are part of genuine planetary systems comprising a central star and at least two planets. Before the identification of the first exoplanets using spectroscopic techniques in 1995, circumstellar disks had been discovered around stars, including both accretion disks (remnants from the formation of the star itself) and material used in planet building (called debris disks). What is more surprising still is that planetary systems have been detected that also include circumstellar disks and are therefore at an early stage of evolution, in which exoplanets would still be in the process of formation or have only recently finished forming.

In terms of their properties, the eight planets of our Solar System fall into two distinct groups: the dense, rocky planets similar to the Earth, and the giant planets, further out, with densities similar to that of water

EXOPLANETS OR EXTRASOLAR PLANETS
The International Astronomical Union (IAU) came up with a working definition of exoplanets in 2003. According to this definition, planets outside the solar system must be orbiting a star or stellar remnant (white dwarf or neutron star) and have a mass of less than 14 Jupiter masses. Their low mass means they do not reach the interior temperatures and densities necessary for the fusion of deuterium, an isotope of hydrogen with a proton and a neutron in its nucleus, or of any other chemical element; therefore they do not produce energy from this source.

According to this same IAU resolution, sub-stellar objects with masses greater than the above, but in which hydrogen fusion does not take place, should be called brown dwarfs. Meanwhile, isolated objects of planetary mass, below the limit of 14 Jupiter masses, should be called sub-brown dwarfs or given any other appropriate name except planet. Of course, these definitions may be modified as our knowledge progresses.
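These mass thresholds amount to a simple classification rule. The toy sketch below paraphrases the working definition described above; the roughly 80 Jupiter-mass limit for hydrogen fusion is a commonly quoted figure added here for completeness, not part of the resolution as quoted:

def classify_substellar(mass_in_jupiter_masses, orbits_star=True):
    """Toy classifier paraphrasing the IAU 2003 working definition above.
    The ~80 Jupiter-mass hydrogen-burning threshold is a commonly quoted
    figure added for completeness; it is not part of the quoted text."""
    if mass_in_jupiter_masses >= 80:
        return "star (massive enough for hydrogen fusion)"
    if mass_in_jupiter_masses > 14:
        return "brown dwarf (deuterium fusion, but no hydrogen fusion)"
    # Below 14 Jupiter masses there is no deuterium fusion either.
    if orbits_star:
        return "planet (orbits a star or stellar remnant)"
    return "sub-brown dwarf (isolated object of planetary mass)"

print(classify_substellar(1))          # planet
print(classify_substellar(20))         # brown dwarf
print(classify_substellar(5, False))   # sub-brown dwarf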

Tethys, one of the moons of Saturn with a diameter of about 1071 km, as seen by the Cassini-Huygens mission. The data indicate that it is basically made of ice. The image (actually a mosaic of four images) shows the Odysseus crater, which has a diameter of about 450 km. Source: NASA/JPL/Space Science Institute.



Planetary diversity
In terms of their properties, the eight planets of our Solar System fall into two distinct groups: the dense, rocky planets similar to the Earth, and the giant planets, further out, with densities similar to that of water. The giants, in turn, are divided into the gas giants (Jupiter and Saturn, which, like the Sun, are made up of hydrogen and helium, the two lightest elements) and the ice giants (smaller, and containing significant amounts of other light elements, such as carbon and oxygen, in frozen form).

For many years it was assumed that if a planetary system existed around another star it would be very similar to our own, and that both the hierarchy and the masses and properties of its planets would resemble those found in the solar system, with Earth-type planets (basically rocky planets) along with other, much more massive gaseous planets (similar to Jupiter). However, with the discovery to date of 555 planets, the word that sums up their basic properties is diversity. Of them, 134 show transits (without counting the more than 1,000 candidates discovered by NASA's Kepler space observatory), and in many cases it is possible to derive properties such as mass and radius, allowing direct comparison with the Earth's siblings. Thus the exoplanets, by analogy and by comparison with theoretical models, can be grouped into the same types as the planets of the solar system. In any event, the diversity persists, and within it there are extremes, with planets that break all the rules and go way beyond the bounds of the imagination.

What is normal and what is exceptional? Although scientists devote a lot of their time to classification, the answer is not always clear. In the case of the extrasolar planets, the situation is truly confusing.


Within this extraordinary variety are the extremes, planets that break all the rules and go way beyond the bounds of the imagination

The Moon, seen from the Galileo probe (7 December 1992) during complex manoeuvres to reach the orbit around Jupiter. The probe studied the Jovian system from 1995 to 1997. This very high resolution photograph (a composite made of several taken at different wavelengths, from violet through to very close to infrared) clearly reveals the differences between the different regions of our satellite: the Tycho impact crater (the bright region at the bottom), Oceanus Procellarum (left), Mare Imbrium (centre left), Mare Serenitatis and Mare Tranquillitatis (centre) and Mare Crisium (close to the right edge). The Moon is characterised by its extremely low water content. Source: NASA/JPL/USGS

The surface of Venus. This image is the result of many years of observations by the Magellan probe, which mapped the planet's surface using radar, given that its dense atmosphere makes it impossible to observe its geography directly. The gaps left by this probe were filled in with data from various other instruments and missions, including Arecibo, Venera and Pioneer Venus. Source: NASA/JPL/USGS


In reality the situation is perhaps not as complex as it seems. If we look at the components of the solar system, including the dwarf planets and satellites, we find that we have many of these extremes close to home among our neighbours: we even live on such a planet. Of course, there is also an observational bias, as we know our solar system much better than those beyond the interstellar seas. Let's take a look at some examples of these extremes:

I) In terms of physical properties:

•  CoRoT-Exo-1 b, the largest planet.

•  PSR 1257+12 b, the planet with the smallest mass; or Mercury in our solar system, which has a comparable mass. At the other extreme, the two planets (or brown dwarfs) detected around the star Zeta Oph, each with a mass of more than 20 times that of Jupiter.

•  The Earth, on account of its high density. Among the densest is XO-3 b.

•  Saturn and Mars, for their shape.

•  For their chemical composition, Jupiter, Mercury and HD 149026 b. This group also includes satellites of the Earth and Saturn: the Moon, for its lack of water, and Tethys, for its large quantities of it.

•  For its surface temperature, HD 149026 b, at approximately 2,000 kelvin. At the other extreme, Neptune, at 50 K.

II) In terms of their dynamic properties:

•  For their orbits, Neptune, OGLE-TR-56 and HD 80606 b.

•  For their rotation, Jupiter and Venus (very fast and very slow, respectively).

III) In terms of the configuration of the planetary system:

•  The most complex planetary systems: after the solar system, 55 CnC, with 5 exoplanets.

•  The dwarf planet Pluto with its moon Charon, which are practically twins, having comparable mass.

•  For their environment, PSR B1620-26 (a neutron star) located in a very old globular cluster (NGC 6121) with thousands of stars close together.

•  2M1207 b, for the difference in mass between the central body (a brown dwarf) and the planet: twenty Jupiter masses compared with five. In addition, the planet orbits at 46 astronomical units.

Of course, these records are easily broken. You just have to keep trying. In fact, new discoveries are being made so fast that this list is probably out of date already.

Artist's impression of a planet orbiting a pulsar, such as that discovered around PSR 1257+12 in 1992. Pulsars are rapidly spinning neutron stars with very strong magnetic fields emitting high-energy electromagnetic radiation. They are the result of the death of high-mass stars, after they explode as supernovae. The planet, detected by indirect techniques, probably formed from material ejected during the explosion. Source: NASA.

There are dense, rocky planets similar to the Earth, and the giant planets further out, with densities similar to that of water


03.3 … IN THE EXPLORATION OF THE UNIVERSE

Large astronomical facilities: astrophysics on a grand scale

Antxon Alberdi

Instituto de Astrofísica de Andalucía (CSIC)

The first telescope was the human eye. Since Galileo first turned an optical telescope on the sky in the 17th century, telescopes have expanded their collecting area so as to increase their ability to detect ever fainter objects. The major advances made in astronomy have been linked to significant developments in instruments and technologies. The large-scale facilities available today or planned for the future arise from the need to answer scientific questions.

Astronomy, like other branches of science, has accompanied human beings throughout their existence. From such basic and well-known facts as the influence of the Sun and the inclination of the Earth's axis on the seasons and climate, through to its application to practical issues such as navigation (formerly based on positioning using the stars in the sky, and today using the GPS system), astronomy has been a constant companion of mankind. It has had a powerful impact on our culture. Nor should we overlook the fact that astronomy deals with issues as important and interesting to us as the origins of the universe, our galaxy and its stars, our solar system, the Earth and even life itself. Altogether this has led to huge interest and extremely rapid development in this branch of science, with topics in astrophysics being tackled from a multidisciplinary approach (in which physicists, chemists, engineers, mathematicians, biologists and geologists all take part) combining a unique mix of observations, simulations and theoretical modelling.


Light and the telescope
The raw material of astronomy is the light emitted by heavenly bodies. Our research involves capturing emissions from stars across the whole energy range and the whole width of the electromagnetic spectrum. Indeed, astronomy in the 21st century is based on multifrequency analysis; that is to say, to understand the physics of a celestial object, we need to know about the light it emits across the whole width of the spectrum. This is because the different constituents (gas, dust, stars, etc.) of an astronomical object emit light via different mechanisms of radiation, and do so at very characteristic frequencies. The type of light (i.e. electromagnetic radiation) emitted by an astronomical object depends on its temperature. Thus, gamma rays are emitted by very hot objects, with temperatures T > 10^8 K, such as gamma-ray bursts; conversely, star-forming regions or cold, dense regions of the interstellar medium (the so-called cold universe) emit at radio wavelengths (temperatures of 1-10 K). In the case of stars, hotter stars (temperatures of 10^4-10^6 K) have emission peaks in the ultraviolet range, whereas cooler stars' (T < 1,000 K) emissions are concentrated in the infrared. It is therefore essential to have the right tools to capture this radiation.
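The temperature-wavelength link can be made concrete with Wien's displacement law, lambda_max = b/T, with b ≈ 2.9 × 10^-3 m·K, which gives the wavelength at which a thermal emitter radiates most strongly. A minimal sketch using the temperature regimes just mentioned (blackbody emission is an approximation; gamma-ray sources in particular are usually non-thermal):

WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvin

def peak_wavelength_m(temperature_k):
    """Wavelength of peak blackbody emission: lambda_max = b / T."""
    return WIEN_B / temperature_k

for label, t in [("cold interstellar cloud", 10),
                 ("cool star", 1000),
                 ("hot star", 100000)]:
    print(label, peak_wavelength_m(t))
# 10 K peaks near 0.3 mm (radio/submillimetre), 1,000 K near 2.9 microns
# (infrared) and 100,000 K near 29 nm (extreme ultraviolet).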

The first telescope was the human eye. Since Galileo first turned an optical telescope on the sky in the 17th century, telescopes have expanded their collecting area so as to increase their capacity to detect ever fainter objects (known technically as sensitivity). The problem astronomers found was that the atmosphere hinders astronomical observations in three ways: (i) it prevents the passage of radiation at certain wavelengths; (ii) it distorts the image of astronomical objects, reducing the clarity with which they can be observed; and (iii) even in the visible part of the spectrum, it absorbs up to 20% of the light.

Antxon Alberdi

Has an honours degree in Physics from the University of Zaragoza and a Ph.D. in Physics from the University of Granada. He has worked as a researcher at the Max Planck Institut für Radioastronomie (Bonn, Germany) and at LAEFF-INTA (the space astrophysics and fundamental physics laboratory). He is currently a CSIC research professor at the Instituto de Astrofísica de Andalucía (IAA-CSIC).

He is also a member of the CSIC's Physical Science and Technology Area Commission. He has been deputy director of the IAA-CSIC and secretary of the Comisión Nacional de Astronomía (National Astronomy Commission). He has sat on the International Scientific Committee of the Canary Islands Observatories, the European Space Agency (ESA) IRSI-DARWIN mission and the European Initiative for Interferometry (EII), and was a member of the work group on "The future of the European VLBI Network" under the auspices of RADIONET as part of FP7.

His research focuses on using radio astronomy techniques to study the central regions of the active nuclei of galaxies. He also uses these techniques to study the nuclear regions of starburst galaxies and the angular expansion of young radio supernovae. He is currently taking interferometric observations in the near-infrared range to study the structure of stars in the vicinity of the galactic centre and to determine the orbits of multiple stellar systems.

Astronomy in the 21st century is based on multifrequency analysis; that is to say, to understand the physics of a celestial object, we need to know about the light it emits across the whole spectrum

Antxon Alberdi.


Since the 1970s, space technology has made it possible to put telescopes in space, allowing ranges of the spectrum outside the optical and radio windows to be observed and limiting diffraction, thus enabling more detailed observations (the ability to distinguish fine detail is known technically as angular resolution). Radio-wave interferometers were also developed in the 1970s, allowing angular resolutions of around a thousandth of a second of arc to be obtained. In the 1980s segmented telescopes were developed, which have allowed giant telescopes to be built with diameters of up to 10 m, such as the GTC (Gran Telescopio de Canarias), or the planned future 30-metre EELT (European Extremely Large Telescope), which is due to be built by the European Southern Observatory (ESO) at Cerro Armazones in Chile. Many of these instruments are equipped with adaptive optics, which eliminates the blurring of the image caused by the atmosphere and makes it possible to obtain the greatest possible sharpness.
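The figures quoted here follow from the diffraction limit: a telescope of diameter D (or an interferometer of baseline D) observing at wavelength lambda cannot resolve detail much finer than about lambda/D radians. A rough sketch (the example values are illustrative):

import math

RAD_TO_ARCSEC = math.degrees(1) * 3600  # radians -> seconds of arc

def resolution_arcsec(wavelength_m, size_m, factor=1.22):
    """Diffraction-limited angular resolution, theta ~ factor * lambda / D.
    factor ~1.22 for a filled circular aperture; ~1 for an interferometer
    whose 'D' is the baseline between antennas."""
    return factor * wavelength_m / size_m * RAD_TO_ARCSEC

# A 10 m optical telescope at 500 nm (with adaptive optics):
print(resolution_arcsec(500e-9, 10.0))             # ~0.013 arcsec
# Millimetre VLBI: 3 mm waves on a 10,000 km intercontinental baseline:
print(resolution_arcsec(3e-3, 1.0e7, factor=1.0))  # ~6e-5 arcsec, i.e. tens
                                                   # of microarcseconds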

The multi-range universe
From the viewpoint of observational astronomy, the ideal experiment would be one with: (i) unlimited spectral coverage (from gamma rays and very-high-energy gamma rays through to the longest radio waves); (ii) unlimited spectral resolution (obtaining the best detail in the analysis of spectral lines); (iii) optimal angular resolution (studying astronomical objects with the greatest possible clarity, for which adaptive optics and interferometric techniques are used); (iv) maximum observational sensitivity (for which it is necessary to increase the collecting surface); (v) polarimetric information from the four Stokes parameters; and (vi) maximum coverage and temporal resolution. Aiming for these goals is the path that astronomy is on, and progress has been linked to the major advances made in technology and instrumentation.

Multifrequency astronomy is now a reality. There is currently a range of instruments in space, including the Hubble Space Telescope (HST), operating in the optical and ultraviolet; the Spitzer space telescope, in the infrared; the XMM-Newton and Chandra observatories, in X-rays; Fermi, in gamma rays; and the Herschel telescope, in the far infrared and other submillimetric wavelengths. Additionally, there are space missions dedicated to very specific scientific objectives, such as CoRoT, Swift, Kepler, etc. Other missions are also in the pipeline, such as the James Webb Space Telescope, which will make observations in the infrared range. There are also missions dedicated to the exploration of the solar system, such as Cassini-Huygens, currently orbiting Saturn, and Rosetta, which is on its way to comet Churyumov-Gerasimenko.

On Earth, we have giant telescopes with diameters of 8 to 10 metres, such as the GTC (Gran Telescopio de Canarias), and the VLT (Very Large Telescope, which belongs to ESO), Keck, Gemini and Subaru, many of them incorporating adaptive optics to make observations at the diffraction limit and with an arsenal of very high-performance instruments available. Some of them can be combined interferometrically, as in the case of the VLTI. Many telescopes in the 2 to 4 m class are engaged in mapping deep space and, in fact, their future will be associated with legacy projects allowing systematic sampling of astronomical sources. To look at gamma radiation we have the HESS telescope, in Namibia, and MAGIC, at Roque de los Muchachos (La Palma, Canary Islands). In radio waves we have the VLA (Very Large Array), eMERLIN (electronic MERLIN), LOFAR (Low Frequency Array), very long baseline interferometry (VLBI) networks, etc.

The major advances in astronomy have been linked to significant developments in terms of instruments and technologies


The spectacular progress of radioastronomy in recent years also stands out. In terms of sensitivity, since the pioneering observations by Karl Jansky in the 1930s, there has been an improvement of 12 orders of magnitude. Similarly, angular resolution, with the systematic use of radio interferometers, has improved by several orders of magnitude. At present, with very long baseline interferometry at 3 mm we can achieve an angular resolution of tens of microseconds of arc, with an astrometric precision that is even higher. ALMA (Atacama Large Millimeter Array), which will work at millimetric and submillimetric wavelengths, is currently being commissioned. And for the future, SKA (Square Kilometre Array), the large centimetric radio interferometer with a collecting surface of a square kilometre, and its precursors, such as ASKAP and MeerKAT, are planned.

The big questions in astronomy
The major advances made in astronomy have been linked to significant developments in instruments and technologies. The large-scale facilities available today or planned for the future arise from the need to answer scientific questions. What are these big questions? By way of example, the key scientific questions set out by the ASTRONET expert committee (an initiative created by various European funding bodies, such as the European Southern Observatory and the European Space Agency, to define a roadmap for astronomy in the region over the next 15 to 20 years in a way that maintains leadership) when it defined the "Strategic Plan for European Astronomy in the 21st Century" are reproduced below. Four major areas of work were defined; for each of them, the unanswered questions that need to be addressed during the period were posed, and the astronomical infrastructure necessary to answer them was defined:

•  Do  we  understand  the extremes of the universe?

How did the universe begin? What are its basic constituents? What are dark matter and dark energy? Can we observe extreme gravity in action? What mechanisms explain supernova explosions and gamma-ray bursts? How are accretion disks, relativistic jets, supermassive central objects and magnetic fields in large cosmic accelerators interrelated? What can we learn about the universe from high-energy radiation and cosmic rays?

•  How do galaxies form and evolve?

How did the universe emerge from the dark ages? How has the structure of the universe evolved? How has the universe evolved chemically? What are the feedback mechanisms between gas, stars and dust in the galaxies? How did our galaxy, the Milky Way, form?

•  What is the origin and evo-lution of stars and planets?

How do stars and star systems form? Is the initial mass function (the number of stars born per unit of mass) universal? What is the feedback mechanism between the interstellar medium and the stars? How do planetary systems form and evolve? What is the "demography" of the planets in our galaxy? Are there signs of life on exoplanets?

•  The Solar System. What is our place in the universe?

Do we understand the Sun's fundamental physical processes in detail? What is the dynamic history of the Solar System? What can we learn about the formation and evolution of the Solar System from in situ exploration? Where should we look for life in the Solar System?

In radioastronomy, since the pioneering observations by Karl Jansky in the 1930s, there has been an improvement in sensitivity of 12 orders of magnitude



Some of the large facilities
It is impossible to make an inventory of all the large astronomical facilities. Here I will describe just some of the facilities that are functioning in different ranges of the electromagnetic spectrum and which are available to the Spanish astronomy community through a process of assessment of observing proposals based on scientific merit.

The Gran Telescopio de Canarias (GTC)
This is a segmented primary-mirror telescope with a diameter of 10.4 metres installed at the Roque de los Muchachos Observatory (Photo 1). The telescope is a Spanish initiative, led by the Instituto de Astrofísica de Canarias (IAC) with the participation of Mexico and the United States. It has a variety of ambitious scientific objectives, all based on the telescope's enormous sensitivity. The first instrument to come into operation was OSIRIS, capable of multi-object imaging and spectroscopic analysis in the visible range. The first scientific results have now been published, which, combined with data from other telescopes, have permitted studies of magnetars, the characterisation of exoplanets' atmospheres, the precise determination of the mass of a stellar black hole (five solar masses) in an X-ray binary system, the characterisation of the reactivation of a quasar as the result of a galactic encounter or collision, and the study of the rupture of a star's photosphere through its spectral evolution and its transformation into the spectrum of a supernova, among others. The second instrument, CanariCam (a camera and spectrograph with polarimetric and coronagraphic capabilities in the thermal infrared range), will be coming on stream in the next few months.

Photo 1. Image of the Gran Telescopio Canarias (GTC), at the Roque de los Muchachos Observatory. It has a segmented primary mirror with a diameter of 10.4 m. Source: Photo courtesy of the author.

Part of the European Southern Observatory, VLT is European astronomy’s biggest piece of equipment. It is located on Cerro Paranal, at an altitude of 2,400 m


Very Large Telescope (VLT)
The VLT is European astronomy's biggest piece of equipment (Photo 2). It is located on Cerro Paranal, at an altitude of 2,400 m. It belongs to the European Southern Observatory (ESO), of which Spain is a member. It comprises four 8.2 m diameter telescopes together with a further four auxiliary 1.8 m telescopes. Additionally, the telescopes can be joined interferometrically to achieve resolutions of tens of milliseconds of arc. The 8.2 m telescopes have the best instrumentation available, including broad-field cameras, adaptive-optics cameras, and standard and high-resolution multi-object spectrographs. They cover a broad swathe of the electromagnetic spectrum, from the deep ultraviolet (300 nm wavelength) through to the mid-infrared (24 microns).

The VLT has had a huge impact on observational astronomy. It has produced spectacular results, such as the detailed study of the orbits of stars around the black hole at the centre of the galaxy, the first direct image of an extrasolar planet, the determination of the expansion of the universe based on the study of type Ia supernovae, the detection of the oldest stars in the universe (with an age of 13.2 billion years) and the confirmation of the collapse of a hypernova, an extremely massive star, as the origin of gamma-ray bursts.

Atacama Large Millimeter Array (ALMA)
ALMA is an interferometer that, when in full operation towards 2013, will comprise 66 antennas, of which 54 will have a diameter of 12 metres and the remaining 12 a diameter of 7 metres (Photo 3). It works at millimetric and submillimetric wavelengths (0.3 to 9.6 mm), the range in which gas and dust emit radiation. It is being built in the Atacama Desert (Chile) at an altitude of 5,000 m, a location which offers excellent conditions of transparency and atmospheric stability. It is currently at the commissioning stage and will produce its first scientific output in late 2011. ALMA will offer a unique combination of sensitivity, angular resolution, spectral resolution and image quality. It will be the ideal instrument with which to obtain images of galaxy, star and planet formation, both in continuum mode and in emission lines. It is believed that the early universe was mainly gas (basically hydrogen) from which the first stars formed over the first hundred million years. It is also thought that these stars were extremely massive and luminous, with short lifetimes, and that after exploding as supernovae they enriched the interstellar medium with heavy chemical elements. ALMA will allow us to obtain information about these first stars from the emission of the dust formed in their envelopes. Moreover, ALMA will be able to penetrate the molecular clouds where new stars are forming and chart the first stages of their formation, their protoplanetary discs and even young planets. It will also provide extremely detailed information about the complex chemistry of the interstellar medium.

Photo 2. Aerial view of the VLT Observatory (Very Large Telescope), European Southern Observatory (ESO), located in Cerro Paranal (Chile). It is possible to distinguish the 8.2 metre telescopes along with the 1.8 metre auxiliary telescopes. These telescopes have optical and infrared instrumentation that can operate interferometrically. Source: Photo courtesy of the author.

The Gran Telescopio de Canarias (GTC) is a segmented primary-mirror telescope with a diameter of 10.4 metres installed at the Roque de los Muchachos Observatory



The future
These are just some of the pieces of infrastructure that will be available to astronomers over the next few years. There will also be the James Webb Space Telescope (JWST), the natural successor to the Hubble Space Telescope, which will work in the infrared and will have objectives complementary to those of ALMA. There will be new X-ray satellites, such as IXO (International X-Ray Observatory, still pending a final decision), which is due to replace XMM-Newton and Chandra. There are also X-ray interferometric projects, such as MAXIM. It will also be an age of giant optical telescopes: in particular, ESO will construct the EELT (European Extremely Large Telescope), with a diameter of 30 metres. The future at radio wavelengths is SKA (Square Kilometre Array). Its enormous sensitivity, large field of view and angular resolution will represent a step forward in fields as diverse as the formation of the early galaxies, cosmic magnetism, extreme gravity, and binary pulsar-stellar black-hole systems, which represent unique sources of gravitational waves.

The future is here. Spanish astronomers will have the opportunity to play an active part in it and to access most of the large scientific infrastructures on a competitive basis. We also hope to be able to join projects like SKA, in which Spain has not yet decided whether or not to participate.

Photo 3. Artist's impression of the ALMA interferometer (Atacama Large Millimeter Array) in the Atacama Desert (Chile). It works at (sub)millimetric wavelengths. It is currently at the commissioning stage, although the first call for proposals for real-time observation was run in the spring of 2011. Source: Illustration courtesy of the author.

04 … in the borderlands


04.1 … IN THE BORDERLANDS

The present and future of synthetic biology

Javier Macía and Ricard Solé

Universitat Pompeu Fabra

Synthetic biology can be understood to mean designing and making biological components and systems that do not exist in nature. The term also encompasses techniques enabling modifications to be made to the design of existing biological systems. The ultimate goal is the creation of new organisms able to respond to specific stimuli in a programmed, controlled and reliable way.

While the 20th century will undoubtedly be remembered as the century of physics, the current century may go down in history as the century of biology. Over the last few decades, the rapid expansion of scientific and technical resources and efforts, together with the impact of the latest discoveries and their socio-economic implications, has put biology at the centre of the scientific arena.

Progress in areas such as the study of cancer or the brain promises to change many of our ideas, bringing us closer to understanding the origin of previously incurable diseases and allowing treatments to be developed for them. Much of this progress is the result of improved experimental techniques, but the development of theoretical models and computer simulations has also played a part. Thanks to collaboration between researchers from different disciplines, it has been possible for biology to adopt a quantitative vision closer to that of physics. This meeting of different disciplines that were once remote from one another has led to the development of so-called systems biology, a discipline in which biological complexity is considered in terms of interacting systems. This represents a shift away from the reductionist paradigm that dominated the 20th century, which took a primarily molecular view of living organisms.

In the experimental field, the development of genetic engineering and the ability to manipulate not just reactions or specific components but complete, complex cellular processes has paved the way for a new discipline, synthetic biology, which will undoubtedly change our understanding of life and its limits. Synthetic biology can be understood to mean designing and making biological components and systems that do not exist in nature. The term also encompasses techniques enabling modifications to be made to the design of existing biological systems. The ultimate goal is the creation of new organisms able to respond to specific stimuli in a programmed, controlled and reliable way. This is made possible by introducing DNA sequences that code for new genes, often originating in species different from the one being manipulated. Thus, bacterial cells are often genetically modified by adding foreign genes, be they from viruses, other bacteria, or humans. These new genes, and the new genetic regulation associated with them, are able to induce new behaviours (functions) in the recipient cells, enabling them to be reprogrammed.

Biomedicine will undoubtedly be one of the areas that benefits most from these advances, in fields such as gene therapy or tissue regeneration. It may even make it possible to obtain so-called smart drugs, comprising a synthetic envelope containing a diagnostic molecule able to detect pathology indicators and decide whether or not to release the drug it contains. However, the range of possible applications is limited only by the imagination. Table I shows a list of some of the potential applications of synthetic biology on which scientists are currently working. Achievements such as biofuels (hydrogen or ethanol), the efficient conversion of wastes into bioenergy and the use of modified bacteria and fungi able to eliminate toxic compounds and decontaminate ecosystems are already a reality.

The impressive progress made in the field of synthetic biology over the last decade has largely been the result of applying an engineering approach to the non-engineering context of a biological system. Conceiving of genetic circuits as arrays of switches which turn gene expression on and off has made it possible to develop devices such as bistable circuits (i.e. circuits that can be in one of two possible states, like a light bulb that is either on or off) or oscillator circuits, which seem at first sight to belong more to electronics than to biology. To date, circuits have been created that are able to make non-trivial decisions depending on the signals they receive from the environment by following a predefined program, something that can clearly be defined as computing. Table II lists some of the most significant advances made in the last decade. Many of these circuits have been inspired by designs from electronics, despite their using cellular components, such as genes or proteins, rather than transistors.
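The bistable "toggle switch" of Gardner et al. (the first entry in Table II) is commonly described by two coupled equations in which each protein represses the synthesis of the other; which protein "wins" depends on the circuit's history, and that is what makes it a memory element. The sketch below is a minimal numerical illustration (the parameter values are invented for clarity and are not those of the original paper):

def simulate_toggle(u0, v0, alpha=10.0, beta=2.0, dt=0.01, steps=5000):
    """Euler integration of the mutual-repression toggle switch:
    du/dt = alpha / (1 + v**beta) - u
    dv/dt = alpha / (1 + u**beta) - v
    Each protein represses the synthesis of the other and decays linearly."""
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1.0 + v ** beta) - u
        dv = alpha / (1.0 + u ** beta) - v
        u, v = u + dt * du, v + dt * dv
    return round(u, 2), round(v, 2)

# The final state depends on the initial conditions: the circuit "remembers"
# which protein started ahead -- a 1-bit biological memory.
print(simulate_toggle(5.0, 0.0))  # u ends high, v ends low
print(simulate_toggle(0.0, 5.0))  # u ends low, v ends high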

Javier Macía

Has a PhD in Physics from the University of Barcelona. He is currently a lecturer in the Department of Experimental and Health Sciences at Pompeu Fabra University (Barcelona) and works in the Complex Systems laboratory led by Dr Ricard Solé.

He is active in a diverse range of research fields within the sphere of synthetic and systems biology. In particular, his research focuses on unconventional computation systems, particularly in living systems, from the viewpoint of theory and experimental implementation. He is also working on the study of collective behaviours synthetically induced in bacteria by means of genetic modifications. In the last few years he has taken part in European projects devoted to the development of the first synthetic proto-cell (PACE project) and computational circuits made up of genetically modified cells.

Progress in areas such as the study of cancer or the brain promises to change many of our ideas, bringing us closer to understanding the origin of previously incurable diseases and allowing treatments to be developed for them

Javier Macía.


Although this inspiration from electronics has enabled substantial advances to be made rapidly, it can also become a bottleneck, limiting the increased complexity of biological devices.

Living systems have unique characteristics not found in electronic systems, forcing us to think differently about how we create biodevices. Several factors currently limit the development of more complex devices. These include, in particular: a lack of knowledge about many of the building blocks (genes, proteins, etc.) used; the fact that introducing the components that make up the circuits into a cell makes their behaviour unpredictable, for example because of uncontrolled interactions with other parts of the cell; possible incompatibilities between different parts that may come from different species; and, finally, the problem of connecting the different parts to one another. Precisely this latter problem is one of the biggest challenges yet to be overcome. Whereas in electronic circuits every wire is physically isolated, in the cellular context everything is mixed together and therefore each "connection" has to be made using a different biochemical element. This considerably limits the size of the circuits that can be built. In order to overcome these difficulties, several research groups have proposed new approaches to the problem of enabling circuit complexity to be increased without exacerbating the associated problems.

Ricard Solé

Has a Ph.D. in Physics from the Catalonia Polytechnic University. He is a lecturer at Pompeu Fabra University, where he heads the Complex Systems laboratory. His research in this field ranges from theoretical ecology to the study of social networks, language networks and networks relating to complex systems such as traffic or the Internet. He is an external professor of the Santa Fe Institute, a senior member of the Astrobiology Centre, associated with NASA, and an adviser to the European Complex Systems Society.

In 2003 his research, in collaboration with Ramon Ferrer i Cancho, earned him the Ciutat de Barcelona Scientific Research Prize for the paper "Least effort and the origins of scaling in human language," published in 2003 in the US journal Proceedings of the National Academy of Sciences.

Smart drugs: A smart drug consists of an envelope containing a drug and a molecular mechanism to detect a pathology. When the pathology is detected, the drug is released into the organism. A possible example of this technology would be drugs designed to detect changes in hormone levels and respond by secreting certain molecules or synthesising certain proteins.

Gene therapy: The design of biological circuits that detect abnormal physiological changes in cells and respond to correct the anomaly or induce the elimination of the abnormal cells. The most obvious application is in cancer treatment.

Tissue repair and regeneration: Systems comprising sensors able to recognise the presence of damage in certain tissues, combined with a group of enzymes able to repair the damage.

Bioremediation: The use of genetically modified bacteria and fungi able to eliminate toxic compounds and decontaminate ecosystems.

Biosensors: Analytical devices comprising an element able to recognise and interact with substances or microorganisms of interest, and an electronic system which allows the signal produced by this interaction to be processed.

Energy: There are three main fields of research into the production of bioenergy using genetically modified microorganisms, concerning organisms able to produce hydrogen or ethanol, to convert waste into energy, or to turn solar energy into hydrogen.

/// Table I. Potential applications of synthetic biology ////////////////////////////////////////////////////////


The distribution of circuit components across different coexisting cell types seems to be one of the most promising avenues. This would mean biological circuits would not be located in cells of just one type but distributed across several cell types such that, although the individual cells are unable to perform the planned task on their own, when put together they act in unison to achieve the desired goal. Distributing tasks between different cell types makes it possible to greatly reduce the genetic manipulation needed in each cell, thereby minimising the problems mentioned above and opening the way for the construction of more complex circuits.
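To make the idea concrete, here is a toy sketch in the spirit of the distributed NOR-gate approach cited in Table II (Tamsir et al.): each "cell type" computes only a NOR gate, and the intermediate signals, which in a real implementation travel between cells as diffusible chemical "wires", are passed from one cell to the next. The five-cell XOR decomposition shown is a generic textbook construction, not the specific layout of the published work:

def nor(a, b):
    """Each engineered cell type computes a single NOR: 1 iff both inputs are 0."""
    return int(not (a or b))

def xor_from_nor_cells(a, b):
    """XOR built from five 'cell types', each holding one NOR gate. The
    intermediate signals s1..s3 stand in for diffusible chemical 'wires'."""
    s1 = nor(a, b)          # cell 1
    s2 = nor(a, s1)         # cell 2
    s3 = nor(b, s1)         # cell 3
    xnor = nor(s2, s3)      # cell 4: XNOR of the two inputs
    return nor(xnor, xnor)  # cell 5: NOR of a signal with itself acts as NOT

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_from_nor_cells(a, b))  # prints 0, 1, 1, 0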

Ricard Solé.



Bistable device (2000): acts like a memory element able to store two different states. "Construction of a genetic toggle switch in Escherichia coli". Gardner et al. Nature 403, 339-342.

Oscillator (2000): generates an output signal (the synthesis of a protein) whose concentration changes in an oscillating way over time. "A synthetic oscillatory network of transcriptional regulators". Elowitz et al. Nature 403, 335-338.

Population control (2004): a circuit that regulates the number of bacteria growing in culture. When the number exceeds a certain threshold, death of the excess cells is triggered to regain the desired level. "Programmed population control by cell-cell communication". You et al. Nature 428, 868-871.

Band-pass circuit (2005): a system consisting of a population of modified bacteria able to detect the presence of a particular molecule if its concentration lies between two previously established (high and low) levels. "A synthetic multicellular system for programmed pattern formation". Basu et al. Nature 434, 1130-1134.

Binary computational circuit (2007): a circuit programmed to respond to different combinations of external signals by expressing a given protein (or not); conceptually identical to the electronic circuits found in computers. "A universal RNAi-based logic evaluator that operates in mammalian cells". Rinaudo et al. Nature Biotechnology 25, 795-801.

Distributed binary computational circuit (2011): circuits able to implement complex binary functions, with the circuit distributed across different cell types. "Distributed biological computation with multicellular engineered networks". Regot et al. Nature 469, 207-211; "Robust multicellular computing using genetically encoded NOR gates and chemical 'wires'". Tamsir et al. Nature 469, 212-215.

/// Table II. Some of the most important advances of the last decade //////////////////////////////////////////////////////////////////////////////

The combination of new design principles and the effort being made to standardise biocompatible parts may make it possible to overcome the current limitations in the near future. In this context it is worth mentioning in particular MIT's (Massachusetts Institute of Technology) initiative known as the Registry of Parts. This is a collection of hundreds of biological parts that can be assembled following a standard procedure to create new devices. In an example of international scientific cooperation, numerous groups of researchers from around the world are using this collection of parts in their work, while also contributing new approaches.

However, beyond the current scientific and technical limitations, and the need for new approaches not inspired by today's electronics, it is clear that the question of how far it will be possible to go in this new field has yet to be answered. Today's speculation could be tomorrow's reality. The answer is still a mystery. The physicist Freeman Dyson has correctly pointed out that the emergence of synthetic biology marks the end of Darwinian evolution as we know it. By intervening in natural regulatory mechanisms, we can tap into a whole universe of possibilities, many of which would never have been reached by natural evolution. For this same reason, we may perhaps be able to solve problems that have so far been intractable.

Perhaps it is too early to be able to grasp the full potential of this new discipline. Looking back to the field of electronics as a historical reference, it is worth remembering that from the development of the first transistor in 1948 through to the creation of the first commercial microelectronic devices in the early 1970s took more than 20 years. It is therefore very likely that we will have to wait 10 or 15 years for synthetic biology to reach a sufficient level of development for direct application to fields such as biomedicine or environmental bioremediation. What is certain, however, is that we are witnessing and actively participating in one of the scientific and technological revolutions that will shape the course of future knowledge.


04.2 … IN THE BORDERLANDS

Quantum information

Antonio Acín

Instituto de Ciencias Fotónicas (ICFO)

Antonio Acín's article offers an introduction to the theory of quantum information: a theory that breaks new ground in the way we process information. Some of its main outcomes could be more powerful computers and simulators or secure cryptography schemes.

Today's society is often dubbed the communications society, due to the importance of mass media and communications in our lives. Today, we can access detailed information on any topic from anywhere in the world via our computer or even on our mobile phone, something which was unimaginable just a few years ago. However, what are the rules that govern the information protocols we use? For example, suppose you want to compress an image to send it to a friend. Everyone knows that the degree of compression possible depends on the image's complexity: a photograph of a blue sky can be compressed more than an image of a busy city street, say. Similarly, we know that for any image there will be a limit to its compression, and that this limit will not be arbitrary. Can we define it? More generally, is there a theory that describes the processes of transmission and processing of information and, for example, establishes the minimum number of bits needed to represent a given image? The answer to this question is yes, and it is given to us by information theory.

The foundations of information theory were laid by Claude Shannon at the tail end of the 1940s. In a series of seminal papers he introduced the concept of what came to be known as "Shannon entropy" and demonstrated how information could be sent over noisy channels, by calculating the optimal information transmission rate for each channel. Since then, the theory has grown and developed in parallel with the consolidation of the communication society. One of the great virtues of the theory is its mathematical abstraction. Take the unit of information, the bit. This can be represented by any system that can have two levels. For example, imagine a coin: a binary zero can be associated with heads and a one with tails. We can then say that the coin provides a physical medium for this bit of information, or alternatively, that this bit has been coded with the coin. Leaving the clear practical disadvantages to one side, all today's communications could be carried out in an equivalent way with coins. And, of course, the same reasoning applies to any other mode of encoding information that comes to mind: from the point of view of information and the applications using it, the physical hardware on which it is stored is irrelevant.
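To make the compression limit concrete, here is a minimal sketch of Shannon's idea: a source emitting symbol i with probability p_i cannot, on average, be compressed below H = -Σ p_i log₂ p_i bits per symbol. The function and the two toy "images" are my own illustration, not anything from the article:

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(data: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

blue_sky = "b" * 990 + "w" * 10    # almost uniform: highly compressible
city_street = "abcdefgh" * 125     # eight equally likely symbols

print(entropy_bits_per_symbol(blue_sky))     # ~0.08 bits/symbol
print(entropy_bits_per_symbol(city_street))  # 3.0 bits/symbol
```

The nearly uniform "blue sky" needs less than a tenth of a bit per pixel, while the busy "city street" needs three full bits: exactly the asymmetry the text describes.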



In the early 1980s, a group of researchers interested in information theory pondered the following scenario: if the technological progress being made on the miniaturisation of the devices used to transmit and process information were to continue, in the near future we would be able to code and store information on atomic particles at the microscopic scale. However, it is well known that the laws governing the world on our scale, the macroscopic world, are not the same as those on the microscopic level. Newtonian physics, based on Newton's equations of motion, reflects faultlessly what happens on the scale of the world we see around us. Nevertheless, these laws are unable to predict phenomena that occur at the microscopic scale, for example, the interactions between atoms and photons. It was, in fact, this failure of Newtonian physics when describing a series of experiments in the late 19th century that led to the birth of quantum physics, a new theory of physics which could give a satisfactory explanation for these phenomena. The question that arose around 1980 was: will coding information on particles governed by different physical laws entail a change in the way we transmit and process information? Can the quantum laws, which represented a revolution in our understanding of nature on the microscopic scale, also change the theory of information? At first sight, we would expect the answer to this question to be no: information theory is an abstract mathematical theory which has little or nothing to do with physics. However, this first intuition turns out to be false: surprisingly, when information is coded on quantum particles, new possibilities and applications arise that have no analogues in the information theory previously developed. The quantum laws open up a new range of possibilities to explore, and this is precisely the aim of the new theory: to provide the principles that govern the processing and transmission of information when it is coded on quantum particles.

Antonio Acín

Is a telecommunications engineer, having qualified at the Universitat Politècnica de Catalunya, and has a degree in physics from the University of Barcelona, where he also took a PhD in physics. His postdoctoral training included a study visit to the applied physics group at the University of Geneva and a period at ICFO, the Institute of Photonic Sciences.

He is currently ICREA professor at ICFO, where he heads the Quantum Information Theory group. In 2008 he was awarded a Starting Grant by the European Research Council (ERC) and in 2010, jointly with Stefano Pironio and Serge Massar from the Brussels Free University, he won a prize from the French journal La Recherche for the best paper that year with the participation of a Francophone institution. His research focuses in particular on quantum information theory, a discipline that studies how quantum phenomena may be used to design new ways of processing and transmitting information. His research also covers aspects of the fundamentals of quantum physics, quantum optics, statistical physics, and condensed matter physics.

Antonio Acín.

Why then?

As mentioned, quantum information theory represents a paradigm shift in the way we think about information: the physical laws of the devices we use to store and process information have a crucial influence on the tasks that can be performed. However, why did this scenario begin to be considered in the 1980s and not soon after Shannon's work, in the 1950s? After all, quantum physics had already established its ascendancy and had been formalised rigorously by the work of John von Neumann. If the ingredients needed to combine information and quantum physics were already in place, why was the application of quantum formalism to the communication and transformation of information not considered earlier?

The answer to this question lies in technology. In our daily lives we have all seen the impressive and unstoppable progress of technology towards the miniaturisation of devices. With a small laptop we can do everything today that not so many years ago would have required a mainframe, not to mention the storage capacity of USB drives or the latest generation of mobile phones. The most graphic way to represent this progress is with Moore's Law (see Figure 1). In fact, it is not really a law, but rather an observation made by Gordon Moore, one of the founders of Intel. When he studied how devices were evolving over time, he noted that they were getting smaller at an exponential rate, halving in size every 18 months. If this exponential rate of improvement continues, a trend which has been sustained to date despite minor corrections, we will soon be coding information on atomic particles. This prediction was the main motivation spurring researchers to consider the quantum scenario for the transmission and processing of information, thus leading to the emergence of quantum information theory.
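As a rough illustration of the kind of extrapolation behind Figure 1, here is a minimal sketch. The starting size, the 18-month halving time and the atomic threshold are assumed round numbers for illustration, not data taken from the figure:

```python
from math import log2

start_year = 1970
start_size_um = 10.0      # assumed feature size in 1970, in microns
halving_years = 1.5       # the 18-month halving mentioned in the text
hydrogen_um = 1e-4        # diameter of a hydrogen atom, ~0.1 nm

halvings = log2(start_size_um / hydrogen_um)
print(f"{halvings:.1f} halvings needed")
print(f"atomic scale reached around {start_year + halvings * halving_years:.0f}")
# With a slower halving time (3-4 years, closer to the actual history of
# lithography), the crossing point moves out to the 2020s shown in Figure 1.
```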

Results obtained

Since its birth in the 1980s, quantum information theory has become established as a new scientific field. It is highly interdisciplinary and is attracting attention from a range of different communities, including mathematicians and theoretical physicists along with experimental physicists and engineers. A multitude of results have appeared in recent years and it is difficult here to give a detailed summary of all of them. However, there is certainly a widespread consensus in the community about the most important applications of the field today: quantum computing, quantum simulation, and quantum cryptography.

One of the key outcomes it has produced is the possibility of a quantum computer with a computational power way beyond that of a conventional computer. It should be emphasised that we are referring to an exponential difference: there are problems which a computer based on classical physics would take years to solve, whereas a quantum computer would be able to solve them in hours. Although it is possibly too simplistic, we could say that a quantum computer is the equivalent, in a quantum setting, of the type of computer in use today: a device that is able to prepare an arbitrary initial state, perform operations on it and read out the results obtained. That is to say, it does the same things as a conventional computer today, but on a quantum medium, comprising, for example, a series of atoms on which operations can be performed in a controlled manner. At present, it is not clear in what situations a quantum computer would yield a significant advantage over its traditional equivalent. Understanding which problems are simple and which are complex for a quantum computer, and how these results compare with those of a conventional computer, is one of the key goals of quantum computation, a sub-discipline of quantum information theory.
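Part of the reason the difference is exponential can be seen in a small sketch: the state of n qubits is a vector of 2ⁿ complex amplitudes, so the memory a classical simulation needs explodes with the number of qubits. The code below is my own textbook-style illustration (a state vector and a standard Hadamard gate), not anything specific from the article:

```python
import numpy as np

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                  # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
I = np.eye(2)

# Apply H to the first qubit: the full operator is H (x) I (x) I.
op = np.kron(H, np.kron(I, I))
state = op @ state

print(np.round(state, 3))   # equal amplitudes on |000> and |100>
print(f"{n} qubits need {2**n} amplitudes; 50 qubits would need {2**50:,}")
```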

The second potential application that should be mentioned is quantum simulation.

/// Figure 1. Moore's Law ///

[Graph: transistor size in microns, on a logarithmic scale from 100 down to 0.001, plotted against year from 1970 to 2030, with reference levels marked for optical lithography, X-ray lithography, and the diameter of a hydrogen atom.]

The graph shows the evolution over time of the size of the transistors used to process information. Assuming that the technology will be able to keep up this rate of development, we will soon be close to storing information on atomic particles. At this scale, particles' behaviour is described by quantum physics.

Source: Image courtesy of the author



A common problem in many scientific studies, not just in physics, is that the simulation of many particles requires considerable computational power. It is well known that a great deal of research activity in any discipline involves computer simulations: for example, a theoretical model may exist which we believe may reproduce experimental results, so it is simulated on a conventional computer to confirm whether this intuition is correct. However, the ubiquity of quantum physics in any field of science means there are many problems which require the simulation of many-particle quantum systems, something which is extremely complex to do on computers based on classical physics. The idea of a quantum simulator is to build an ad hoc quantum system that can be controlled, in order to be able to simulate other unknown quantum systems. In this case, the applications are more focused on science, as having a good quantum system simulator would be a fundamental step forward in fields such as condensed matter physics or quantum chemistry. Moreover, it is clear that the subsequent practical applications of the scientific results obtained from simulating complex quantum systems could be very significant.

The last in the list of applications is quantum cryptography: using information coded in quantum states, two trusted parties can exchange information in a totally secure way. Indeed, it can be shown that any attempt by an adversary to read this information would be detected, and the communication, its security compromised, aborted. Quantum cryptography is one of the most powerful ideas to have emerged from the field and clearly exemplifies its virtues and the paradigm shift that has taken place. The Heisenberg uncertainty principle is well known: an observer, when trying to measure the state of a quantum particle, modifies it. This result is usually presented as one of the undesirable consequences of quantum theory. Quantum cryptography turns this on its head. Any experiment conducted in any laboratory around the world can show that quantum physics is right and that the uncertainty principle therefore applies. So why not take advantage of it to encrypt information securely? Quantum cryptography simply exploits this principle: if an adversary tries to read the information coded in a quantum state, their intervention will be detected. What, until recently, was seen in a negative light is now a key factor with an application of clear practical interest.

Image 1. Information on atomic particles. The main goal of quantum information theory is to understand how to manipulate and process information stored on quantum particles. New applications are possible thanks to the special properties of these particles, which have no analogue in our macroscopic world described by Newtonian physics. Source: ICFO.
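A toy simulation makes the detection mechanism concrete. The sketch below follows the logic of BB84-style key distribution in a highly simplified form (my own illustration, not a real protocol implementation): measuring a qubit in the wrong basis randomises it, so an eavesdropper introduces errors that the two parties can spot by comparing a sample of their key.

```python
import random

def send_and_measure(bit, send_basis, measure_basis):
    """If bases match the bit is recovered; otherwise the outcome is random."""
    return bit if send_basis == measure_basis else random.randint(0, 1)

def run(n_bits, eavesdrop):
    errors = matches = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        basis_a = random.choice("+x")       # sender's preparation basis
        basis_b = random.choice("+x")       # receiver's measurement basis
        carried = bit
        if eavesdrop:                       # Eve measures in a random basis,
            basis_e = random.choice("+x")   # disturbing the state she resends
            carried = send_and_measure(carried, basis_a, basis_e)
            basis_sent = basis_e
        else:
            basis_sent = basis_a
        received = send_and_measure(carried, basis_sent, basis_b)
        if basis_a == basis_b:              # only matching-basis rounds kept
            matches += 1
            errors += (received != bit)
    return errors / matches

print("error rate without Eve:", run(20000, False))  # ~0
print("error rate with Eve:   ", run(20000, True))   # ~0.25
```

The roughly 25% error rate an intercepting adversary leaves behind is precisely the detectable footprint the text refers to.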

The challenges

Having reached this point, the natural question to ask is: if all these results are so promising, why do we not already have devices based on these ideas? Why do we not yet have a quantum computer? The drawback is technological: many of the applications described require very demanding technical know-how, currently available only to some of the best experimental physics laboratories in the world. The reason is that it is necessary to manipulate a large number of atomic particles in a highly controlled way, in order to (i) allow particles to interact with one another, but (ii) ensure they do not interact with their environment. Indeed, uncontrolled interaction with the environment, also known as "decoherence", is the great practical enemy of the applications of quantum information theory. These interactions cancel out the particles' quantum properties, making it impossible to obtain results beyond those that could be achieved with classical physics. Decoherence is precisely the reason that our world is not quantum: all the frequently surprising quantum phenomena do not manifest themselves in our everyday reality, as for them to do so we would need to control very precisely all the possible interactions with the environment.
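A minimal numerical sketch of this loss of quantum properties, using the standard textbook pure-dephasing model rather than anything specific from the article: the environment decays the off-diagonal terms of a qubit's density matrix, wiping out the superposition while leaving the classical populations on the diagonal untouched.

```python
import numpy as np

# Density matrix of the superposition (|0> + |1>) / sqrt(2)
rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=float)

def dephase(rho, t, T2=1.0):
    """Off-diagonal coherences decay as exp(-t/T2); populations survive."""
    out = rho.copy()
    out[0, 1] *= np.exp(-t / T2)
    out[1, 0] *= np.exp(-t / T2)
    return out

for t in (0.0, 1.0, 5.0):
    print(f"t = {t}:\n{np.round(dephase(rho, t), 3)}")
# For t >> T2 the state is an ordinary classical coin flip: the
# quantumness is gone, exactly as the text describes.
```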

These considerations play an important role in the case of a quantum computer: the degree of experimental control over microscopic particles needed to build a quantum computer is currently unthinkable, and will undoubtedly take several years to attain. Quantum simulation, however, without going into details, is somewhat less demanding in this regard. It is likely that within the next few years the first simulators will be built, based, for example, on atoms trapped in optical lattices, and able to simulate systems that cannot be represented even with the best of today's or tomorrow's supercomputers. If this happens, the results will undoubtedly be revolutionary. Finally, quantum cryptography is, by a long way, the least demanding application from the technological point of view, and it is already on the market: there are currently companies marketing quantum cryptography devices. It is therefore a much more mature technology. Indeed, various European universities and research centres, together with the European Space Agency (ESA), have teamed up to put a photon source on one of the modules of the Columbus spacecraft. This source would make it possible to develop quantum cryptography protocols for long-distance satellite communications between two trusted parties (see Image 2).

Concluding remarks

This article has presented an introduction to quantum information theory, a theory that breaks new ground in the way we process information. Its most significant outcomes will be more powerful computers and simulators and secure cryptography systems. Before concluding, it is worth considering the following points. Firstly, apart from the question of whether the quantum computer can or cannot be built, the field has led to a revolution in our way of understanding information. Information is physical in nature, given that, as mentioned, the laws of physics strongly influence how it can be manipulated. The second consideration is that, sooner or later, information technologies will reach the microscopic world, and we need to think about how quantum laws can be used in this new scenario. Why not do so now? Can there be any doubt that this will happen soon?

Image 2. Quantum satellite communications. Several European universities and research centres are working with the European Space Agency on a project to make quantum satellite communications a reality. The goal is to place a source of photon pairs on a satellite to allow quantum cryptography protocols to be developed for use over long distances. (Diagram labels: signal transmission; secure communication.) Source: ICFO.


04.3 … IN THE BORDERLANDS

When economics met physiology

How do living creatures make decisions? This is an interesting question, particularly in the case of human beings. Various disciplines have addressed the issue, looking for what makes us different from one another, to account for such a variety of responses to similar situations. Traditionally, these differences have been investigated from a psychological or sociocultural perspective, without taking into account biological differences between individuals.

Enrique Turiégano

Universidad Autónoma de Madrid (UAM)

Research over the last five years looking at the effect of biological differences between people has produced some interesting results. These describe how certain physiological differences between individuals can affect complex decision-making. The idea is not new. It has long been known that physiological variables (such as hormone levels) influence our behaviour. But, in general, what we know is based on correlations, which offer only weak support for causal relationships between a variable and behaviour. Conducting experiments under standard conditions does not guarantee that the relationships identified are causal, but it makes this more plausible.

Experimental economics offers a toolkit of experimental methods, including a series of simple strategy games that have been thoroughly studied by economists. These games are ideally suited to conducting studies of this kind as they embody simplifications of common social situations, reproduced under laboratory conditions, in which the material benefits that subjects obtain from their decisions also depend on the decisions of others. This yields results that are both readily quantifiable and replicable. Perhaps the best known of these games is the prisoner's dilemma, which demonstrates individuals' tendency to cooperate, where cooperation is understood as making a decision that maximises collective welfare without necessarily maximising the individual's own welfare. The prisoner's dilemma involves two players who have to decide, without coordination (they are unable to communicate with one another), whether or not to cooperate. The payment each receives depends on both his or her own decision and that made by the other player. Economic theory predicts that players would never cooperate, even though the individual reward would be greater if they did. However, in experiments, humans cooperate more often than the traditional view predicts.
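The tension between the theoretical prediction and observed behaviour is easy to see in the payoff structure itself. The sketch below uses the standard textbook payoff values (my own illustrative choice, not numbers from the article):

```python
PAYOFF = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

for theirs in ("C", "D"):
    coop = PAYOFF[("C", theirs)]
    defect = PAYOFF[("D", theirs)]
    print(f"opponent plays {theirs}: cooperate={coop}, defect={defect}")
# Defecting pays more whatever the other player does, which is why theory
# predicts no cooperation; observed human cooperation is the puzzle.
```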

Research in this area has also used three other games: the dictator game, the ultimatum game, and the trust game. The first is very simple: a player is given a sum of money and has the option of sharing it with a second (unknown) person, without any type of response from the second player. This is a direct way of measuring altruism.

The ultimatum game is similar, but the second individual can decide whether to accept the share offered by the first player or reject it. If the second player rejects it, neither player gets any of the money. This game makes it possible to measure the tendency of the second player to punish what he or she considers an unfair offer (at the expense of his or her own benefit). Rational behaviour would be to accept any positive amount. But human beings, like other animals such as chimpanzees, tend to reject offers of less than 20% of the initial amount.

The trust game also has two players, A and B. Player A is given an initial sum of money, from which A decides how much to share with player B (payment α). This amount is tripled before being given to B. Of this money, B can decide to give something back to A (payment β). Payment α reflects trust (A's trust in B) and payment β reflects B's honesty.
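A worked example of the trust game's payoff arithmetic may help; the amounts are my own illustrative choices, not figures from the article:

```python
endowment = 10               # A's initial sum
alpha = 6                    # payment alpha: what A sends (A's trust in B)
received_by_B = 3 * alpha    # the transfer is tripled on the way to B
beta = 9                     # payment beta: what B returns (B's honesty)

payoff_A = endowment - alpha + beta   # 10 - 6 + 9 = 13
payoff_B = received_by_B - beta       # 18 - 9 = 9
print(payoff_A, payoff_B)
# A purely self-interested B returns nothing; anticipating that, a purely
# self-interested A sends nothing. Positive transfers are nonetheless the
# norm in experiments.
```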

Let's look at some examples of the results obtained using experimental economics protocols to study the effects of physiological variables on our behaviour. Or, putting it another way, examples of the results obtained by using physiology to explain the discrepancies between theoretical predictions and the findings of experimental economics. These findings illustrate the new insights that can be gained by applying tools from outside the discipline.

Enrique Turiégano

Has a Ph.D. in biology from the Madrid Autonomous University (UAM). He completed his doctoral thesis in the laboratory led by Dr Inmaculada Canal at the UAM, researching decision-making linked to sexual selection in D. melanogaster. He obtained his postdoctoral training at the Department of Economics of the University of Edinburgh, where his supervisor was Dr Santiago Sánchez-Pages. His research focuses on analysing the extent to which physiologically-based variables, such as present and past levels of hormones or symmetry, are associated with how individuals behave in various economic games. He is currently an assistant lecturer in the Biology Department of the UAM.

Enrique Turiégano.

Oxytocin and vasopressin

Oxytocin and vasopressin are hormones involved in regulating labour and diuresis, respectively. Recently they have been implicated in the formation of monogamous and stable relationships. In mammals, oxytocin is associated with reducing social stress, thus facilitating bonding between individuals. Meanwhile, vasopressin is implicated in the development of protective behaviours in males, relating to the defence of their partner and offspring.

These functions were demonstrated convincingly in experiments with American voles (genus Microtus), some species of which are monogamous and others polygamous. Females of the monogamous species that are given oxytocin form strong bonds with unknown males. Likewise, supplying vasopressin to monogamous males causes them to develop typical male protective behaviour towards unknown females. Even more surprisingly, reproducing the pattern of expression of the vasopressin receptor of the monogamous vole (M. ochrogaster) in a polygamous male vole (M. montanus) turns these polygamous individuals monogamous. This suggests the pattern of expression of the receptor in the brain determines this complex behavioural trait, and it inevitably raises the question of whether this pattern explains the differences between individuals.

It seems it does. Individuals naturally tend to have different patterns of expression. Different alleles of the V1a receptor (one of the vasopressin receptors) cause their carriers to present different patterns of expression that affect their behaviour. In terms of their responses in the games just described, individuals with at least one of these alleles tend to be less altruistic than the rest of the population in the dictator game.

The direct effects of these neuropeptides on human behaviour have also been tested, by administering them to experimental subjects via a nasal spray. The results obtained with oxytocin are the most interesting. In the trust game, individuals playing the role of player A who received oxytocin showed greater trust in the possible investor (player B), giving them a larger sum of money. The interesting thing is that this increase did not happen if they were told that the return would be decided randomly rather than by a human, which shows the effects of the hormone to be specific to social interactions. The same was found in both the ultimatum game and the dictator game. Administering oxytocin increased the sum of money offered to the recipient individual in the ultimatum game but did not do so in the dictator game, where there is no possibility of reciprocation and, therefore, participants do not try to anticipate the feelings their offer will cause in the second individual. It seems that oxytocin increases participants' empathy towards the other player. This empathic feeling also means that in the trust game there is an endogenous increase in oxytocin in individual B when individual A has trusted him or her (correlating with a bigger β payment, that is to say, a more honest response from B).

Testosterone

Testosterone is a steroid hormone that, as well as having multiple effects on development and reproductive physiology, promotes various behaviours in men centred on enhancing their status. In the species studied, testosterone levels are associated with indices of well-being, physical capacity and reproductive success.

The best known and most widely cited effect of testosterone on behaviour is increased aggression. This idea is based on empirical results obtained from both rodents and humans, and while not entirely inaccurate, it does not give the whole picture. Experiments in which the hormone was administered to rats and mice have certainly shown increased aggression, but as these are not social species, direct aggression is the only way the male has to achieve dominance. Moreover, a strong correlation has been found between levels of aggression and testosterone in various human populations (boarding schools, prisons, etc.). However, aggression in these contexts could also be the only mechanism for raising status. Nevertheless, there is also evidence against an unambiguous relationship between testosterone and aggression. For example, administering the hormone to human beings does not always increase aggression or competitiveness (although if subjects are given a placebo and told that they have been administered testosterone, their levels of aggression and competitiveness do rise). This supports the alternative hypothesis: that testosterone promotes behaviours focused on enhancing the individual's status. Sometimes this results in direct competition, but at other times it can mean showing signs of philanthropy (incurring losses to benefit others).

The effect of testosterone on behaviour has been analysed on numerous occasions, looking at its effect on players in both roles in the ultimatum game. In this game, male subjects who reject low offers tend to have higher levels of testosterone. That is to say, they economically punish participants who make an offer they consider unfair, even though this is at the expense of their own benefit (see Figure 1). As regards the effect on the proposer of the offer, experiments in which the hormone was administered to participants have produced contradictory results. In some cases men to whom the hormone was administered made bigger offers; in others they made smaller ones. Indeed, in some cases, administering the hormone has been found to have no effect at all. A similar absence of a clear effect of testosterone has sometimes been detected in the three other games (the dictator game, the trust game, and the prisoner's dilemma). Administering testosterone produces such varied results because the adaptive regulation of hormone levels needs to be taken into account (the so-called challenge hypothesis, described some time ago in birds). This regulation means a potentially competitive situation may or may not involve an increase in the hormone, depending, among other things, on the perceived status of the adversary. That is to say, testosterone levels only increase when an interaction is envisaged that, if the outcome is successful, leads to improved status. The interaction may or may not involve aggression. This hypothesis is supported by the results discussed below.

Testosterone is essential to male development, being responsible for masculinisation in embryonic development and adolescence. In addition to the effects that we are all familiar with, the levels of testosterone in these periods affect the configuration of the individual's nervous system and, consequently, their behaviour as adults.

/// Figure 1 ///

[Charts: 2D:4D digit ratio, testosterone level, and facial masculinity of male responders, grouped by whether they accepted or rejected low offers and average offers in the ultimatum game.]

The image shows the levels of a number of different variables related to testosterone in terms of the response given (acceptance or rejection) to low offers (15% of the initial amount) and average offers (30% of the initial amount) made to male players in the ultimatum game.

Source: Illustration courtesy of the author



The problem is that it is difficult (except with a very long-term longitudinal study) to relate these direct measurements of testosterone in newborns or adolescents to their subsequent behaviour as adults. What researchers can do, however, is study this effect through variables that correlate with testosterone levels in both periods. One such variable is the digit ratio, i.e. the ratio between the lengths of the second and fourth fingers (2D:4D); another is facial masculinity. In a study on pregnant women, the 2D:4D ratio correlated with the relative amount of testosterone to which the foetus was exposed. The 2D:4D ratio is higher in women, and a low value in men correlates with typically masculine traits, such as increased spatial reasoning ability or greater competitiveness. The degree of facial masculinity is determined in adolescence by the level of testosterone the individual secretes. This also has an effect on behaviour: individuals with a more masculine face take more risks.

The results obtained in strategic games with these variables corroborate what I have just described for testosterone concentration. For example, in the ultimatum game more masculine men (in terms of both 2D:4D ratio and facial characteristics) tend to reject lower offers (see Figure 1). Moreover, the results obtained with these variables support the two novel predictions of the effect of testosterone in humans alluded to above. First, testosterone does not necessarily trigger aggressive competition, but can be related to raising status through philanthropy. More masculine individuals in terms of 2D:4D make more generous offers in the dictator game and tend to be more cooperative than competitive in cooperative games (such as the prisoner's dilemma). Secondly, the effect of the hormone is adapted to the context, such that behaviour varies accordingly. For example, as mentioned, individuals with a low 2D:4D tend to be more altruistic in the dictator game, but if they are first exposed to a violent situation the effect is inverted (the offers they make are smaller than average). In the ultimatum game, men with a low 2D:4D ratio tend to reject unfair offers in a neutral context, but they are more likely to accept them in a context of sexual stimulation.

Asymmetry

Asymmetry is another morphometric variable that has recently attracted a lot of attention. It is a property that reflects the organism's ability to maintain stable development despite external factors. The asymmetry that reflects each individual's developmental instability is called fluctuating asymmetry (FA), and is different from the normal asymmetry in the population (so-called directional asymmetry). Low levels of FA indicate an ability to maintain stable development despite external stresses.

THE GAMES

Prisoner's dilemma. The prisoner's dilemma involves two players who have to decide, without coordination (they are unable to communicate), whether to cooperate with one another or not. The payment each receives depends on both his or her own decision and that made by the other player. Economic theory predicts that players would never cooperate, even though the individual reward would be greater if they did. However, in experiments humans cooperate more often than the traditional approach would predict.

Dictator game. A player is given a sum of money and has the option of sharing it with a second, unknown person, without any type of response from the second player. It is a direct way of measuring altruism.

Ultimatum. The ultimatum game is similar, but the second individual can decide whether to accept the share proposed by the first player or reject it. If the second player rejects it, neither player gets any of the money. This game makes it possible to measure the tendency of the second player to punish what he or she considers an unfair offer (at the expense of his or her own benefit).

Trust game. In the trust game there are also two players, A and B. At the start, player A is given a sum of money. He or she can then decide whether to share some of it with player B (payment α). This amount is tripled before being given to B, and player B can decide to give something back to A (payment β). Payment α reflects trust (A's trust in B) and payment β reflects B's honesty.

FA is inversely correlated with variables indicating fitness in organisms, such as longevity and reproductive success. Similarly, a low FA in human beings implies greater success in terms of various factors such as income, attractiveness, number of partners, etc. FA has been linked to human behaviour in many different ways, as it is to be expected that individuals' phenotypic quality affects their behaviour. Less philanthropic individuals (less cooperative, less altruistic, less trusting) tend to have a lower FA. In principle, their greater phenotypic quality raises their probability of obtaining resources for themselves, and therefore reduces their need to establish alliances. This reduced philanthropy has been demonstrated in the prisoner's dilemma, where individuals with low FA (more symmetrical) are less cooperative, and in the ultimatum game, where their offers are less generous.

Moreover, in humans, symmetry is considered an attractive trait. Beauty is undoubtedly a complex variable, but it has been shown that FA determines a large part of it. Beauty affects how individuals are seen, and therefore alters the behaviour of others in terms of the attention they give. This has been analysed through correlations on many levels (average income, likelihood of being given a shorter prison sentence, etc.) and, in a controlled and quantitative way, through strategic games. In the trust game people trust attractive people more, and in the ultimatum game they make bigger offers to people considered attractive (who also turned out to be more symmetrical). That is to say, even in a social context, more symmetrical (more attractive) people find it easier to obtain resources. This could explain why people considered attractive behave differently. In the prisoner's dilemma, the ultimatum game and the dictator game, good-looking people tend to be less philanthropic (i.e., not cooperating and being less generous). And, conversely, people who did not act philanthropically were retrospectively considered better looking (see Photo 1). However, being attractive does not always imply somewhat unsocial behaviour. In the trust game attractive men playing role B returned larger sums of money (showing themselves to be more honest). This may be because, although their greater capacity to obtain resources does not encourage them to be philanthropic in some of the games, it also means they do not feel the need to act unfairly.

To sum up, the data obtained using a mixed approach, combining physiology and economics, to analyse how people make decisions is helping us understand their biological basis. And, in a broader context, it highlights the advantages of looking for answers by applying tools and knowledge from a seemingly unrelated branch of knowledge.

Photo 1. The image shows two average faces built from photos of participants in the prisoner's dilemma game who cooperated (right) and did not cooperate (left). Non-cooperating individuals were more symmetrical and were considered more attractive, as were the averages generated from their faces. Source: Illustration courtesy of the author.


04.4 … IN THE BORDERLANDS

The millennium problems

Analysing and identifying the challenges facing a discipline is always an exciting exercise and has motivated scientists throughout history. Mathematicians are perhaps the most prone to this kind of prospective exercise, and it has a good track record.

Manuel de León

Instituto de Ciencias Matemáticas (CSIC)

David Hilbert's challenge

Just over a century ago, the mathematician David Hilbert (1862-1942) gave a veritable tour de force at the International Congress of Mathematicians in Paris. At 9 o'clock on the morning of 8 August 1900, in the main lecture theatre of the Faculty of Sciences at the Sorbonne, David Hilbert addressed the following words to his expectant audience:

Who would not be happy if he could lift the veil that hides the future to take a look at the progress of our science and the secrets of its further developments in future centuries? In so rich and vast a field as mathematics, what will be the objectives and what will be the guide for mathematicians' thought in future times? What will be the new developments and new methods in the new century?

Hilbert's lecture included a long preamble in which he discussed the nature of mathematics and its role in the progress of the other sciences. Although his list contained 23 problems, he only had time to present 10 of them.

At that time, Hilbert, a professor at the University of Göttingen, was considered Germany's leading mathematician. In the invitation he received to give the inaugural lecture in Paris, his colleague Hermann Minkowski (1864-1909) asked him to take a look at the future. But Hilbert finished preparing his lecture at the last minute and hesitated over the title, which did not arrive in time for the programme, with the result that his lecture was finally held on the third day of the Congress.

Hilbert's effort was in line with the view held by 19th century mathematicians that it was necessary to bring rigour and certainty to the edifice of mathematics. Indeed, in his lecture Hilbert made explicit reference to the nature of mathematical problems and the role of mathematics in science. Hilbert claimed that there was nothing unknowable, no "ignorabimus", in mathematics. Everything could be substantiated and explained in a logical way: if there is a problem, mathematicians will be able to find its solution. In Paris, Hilbert faced a different vision, that of the other great figure of the age, the French mathematician Henri Poincaré, who favoured a more intuitive approach (Poincaré referred to Cantor's set theory as "a disease that mathematics will eventually recover from with time").

In any event, nobody wanted to be cast out of the paradise that Georg Cantor (1845-1918) had created with set theory and his wonderful conception of the different infinities. Many years later, on 8 September 1930 in Königsberg, at the Congress of the Association of German Scientists and Physicians, David Hilbert gave a lecture, four minutes of which were broadcast by radio. Hilbert ended with his claim: "in opposition to the ignorabimus, we offer our slogan: we must know, we will know."

One landmark in this story is the work of the Italian mathematician Giuseppe Peano (1858-1932), who laid the foundations of arithmetic with a collection of axioms and rules. But the question that arose after the Paris Congress was simple and devastating: is this system complete and consistent? Bertrand Russell's (1872-1970) barber paradox cast it into doubt: in a town there is a barber who shaves only the town's male residents who do not shave themselves. Who shaves the barber?

The paradox is connected with the idea of the set of all sets that are not members of themselves. Such a set, if it exists, will be a member of itself if and only if it is not a member of itself. A breach had opened up in the apparently solid mathematical edifice, as no demonstration can be relied on if the logic it rests on harbours such contradictions. One of the most exciting periods in the history of the discipline had begun.
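In symbols, Russell's set and the resulting contradiction can be written as follows (a standard rendering, not taken from the article itself):

```latex
R = \{\, x \mid x \notin x \,\}
\quad\Longrightarrow\quad
R \in R \iff R \notin R
```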

The so-called Hilbert programme, which aimed to create a formal system for mathematics with a demonstration of its consistency (i.e., it leads to no contradictions), completeness (i.e., every truth can be demonstrated) and decidability (i.e., it must be possible to decide, by applying a suitable algorithm, whether a formula can be deduced from the axioms), was cut short almost at the same time as Hilbert was delivering his war cry against the ignorabimus in 1930. Another mathematical genius, Kurt Gödel (1906-1978), proved that any consistent axiomatic system for arithmetic is necessarily incomplete, that is to say, there will be true properties which can never be demonstrated within it. John von Neumann (1903-1957) said after Gödel's presentation of his findings: "It's over."

Nevertheless, it was not over. Rather, after those dramatic events, other heroic mathematicians, such as von Neumann himself and Alan Turing (1912-1954), built on this new approach to develop computers and scientific computing as we know them today.

100 years later

Since David Hilbert drew up his famous list, there have been several attempts to update it. Unfortunately, today there are no longer mathematicians able to master all areas of mathematics as Hilbert did in his time. Not because there are no longer mathematicians of his stature, but because of the incredible development the discipline has undergone over the last century.

Kurt Gödel.

In 1992, the International Mathematical Union (IMU), in its so-called Río de Janeiro Declaration, decided to commemorate the legendary 1900 International Congress of Mathematicians in Paris and proposed that, one century later, mathematicians from around the world run activities throughout the centenary year. The declaration set out three main aims:

• To identify the biggest mathematical challenges of the 21st century.

• To proclaim mathematics as a key to development.

• To improve the image of mathematics through high-quality dissemination.

UNESCO also joined the IMU's declaration, and at its plenary meeting on 11 November 1997 the UNESCO General Conference followed the recommendations of Commission III and approved draft resolution 29 C/DR126 on World Mathematical Year 2000 (WMY 2000), emphasising the educational aspects of mathematics.

The celebration of WMY in 2000 resulted in the organisation of numerous events around the world, run by national committees, and also gave rise to various initiatives aiming to identify the mathematical challenges of the 21st century.

One such initiative is the book edited by Björn Engquist and Wilfried Schmid: Mathematics Unlimited - 2001 and Beyond. Springer-Verlag, Berlin, 2001, 1238 + XVI pages, to which around 90 mathematicians from around the world contributed their knowledge of their respective fields, in the form of 58 articles and five interviews. It is not an encyclopaedia or a work of synthesis, but reading it obviously gives a general overview of the state of the discipline at that time.

Another extraordinary attempt, closer to the goals of WMY 2000, was that of V. I. Arnold: Mathematics: Frontiers and Perspectives. American Mathematical Society, 2000, 459 pages. This book was published under the auspices of the IMU and was part of the activities of World Mathematical Year 2000. The text comprises 30 articles written by some of the world's most influential mathematicians; in fact, 15 of them are by various Fields medallists, including K. F. Roth (winner of the 1958 Fields Medal) and W. T. Gowers (Fields Medal winner in 1998). Certain articles identify some of the most important problems for mathematicians in the 21st century, others review various problems set out by Hilbert, and some explore the motivations of the mathematicians of the time.

The millennium problems

Without doubt, however, the initiative that has had the most powerful impact on public opinion, and is starting to have an effect in the mathematics community, is the list of so-called millennium problems.

This initiative was launched by the Clay Mathematics Institute (CMI), a private non-profit foundation based in Cambridge, Massachusetts. The CMI was created in 1998 on the initiative of the Boston businessman Landon T. Clay and his wife Lavinia D. Clay. The institute's goals are:

• To increase and disseminate mathematical knowledge.

• To educate mathematicians and other scientists about new discoveries in the field of mathematics.

• To encourage gifted students to pursue mathematical careers.

• To recognise extraordinary achievements and advances in mathematical research.

The first president of the CMI was Arthur Jaffe, a renowned mathematician at Harvard University. The institute organises numerous activities, including conferences, lectures for the general public, and seminars.

The institute wanted to celebrate mathematics in the new millennium by establishing seven millennium prizes. The aim was to identify the most difficult unresolved problems and at the same time highlight that mathematics is a living subject, whose frontiers are still open, and point to the importance of working to find solutions to problems of historical significance. The seven problems were chosen by the Institute's scientific committee after much deliberation. There is a prize of $1 million for solving each of them.

David Hilbert.

The millennium problems were presented in Paris on 24 May 2000 at the Collège de France, with lectures by Timothy Gowers, Michael Atiyah and John Tate.

The list is the following:

The Birch and Swinnerton-Dyer conjecture

One of the best-known mathematical problems is the search for integer solutions to equations of the type

$x^2 + y^2 = z^2$

Solving equations like this can be very difficult, and it has been shown that there is no general method for solving such equations in integers (this was Hilbert's tenth problem). There are partial solutions, however, and the Birch and Swinnerton-Dyer conjecture holds that, in the realm of algebraic varieties, the size of the group of rational points is related to the behaviour of a zeta function ζ(s) near the point s = 1.
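For this particular equation the integer solutions (the Pythagorean triples) are in fact easy to enumerate by brute force, which makes it a good contrast with the general Diophantine case, where no algorithm can exist. A minimal sketch, with an arbitrary search bound of my own choosing:

```python
# Enumerate integer solutions of x^2 + y^2 = z^2 with z below a bound.
solutions = [(x, y, z)
             for z in range(1, 30)
             for y in range(1, z)
             for x in range(1, y + 1)
             if x * x + y * y == z * z]
print(solutions[:5])   # [(3, 4, 5), (6, 8, 10), (5, 12, 13), ...]
```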

The Hodge conjecture

Mathematicians study complicated objects by approximating them with simpler geometrical blocks that are glued together appropriately, so as to be able to classify them. This is one of the traditional tasks of the discipline. The Hodge conjecture holds that for the mathematical objects known as projective algebraic varieties, the pieces called Hodge cycles are actually linear combinations of simple geometrical objects called algebraic cycles.

The Navier-Stokes equations

The Navier-Stokes equations date back to the 19th century, and we all feel their effects when we travel by plane, particularly if we meet turbulence during the flight. Mathematicians want to understand these equations, and to do so they need to know more about their solutions. However, despite considerable efforts over decades, they are still relatively poorly understood.
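For reference, the incompressible Navier-Stokes equations in their standard textbook form (u the velocity field, p the pressure, ρ the density, ν the kinematic viscosity); the millennium problem asks, roughly, whether smooth solutions always exist in three dimensions:

```latex
\frac{\partial \mathbf{u}}{\partial t}
+ (\mathbf{u} \cdot \nabla)\,\mathbf{u}
= -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla \cdot \mathbf{u} = 0
```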

The P versus NP problem

One of the most important problems in computer science is determining whether questions exist whose answer can be checked easily, but which require an impossibly long time to solve by any direct procedure (algorithm). P problems are those whose solution is easy to find, and NP problems are those where it is easy to check whether a given potential solution really is one. This problem was formulated independently by Stephen Cook and Leonid Levin in 1971.
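The asymmetry between checking and finding is easy to demonstrate. The sketch below uses subset-sum as the example NP problem; the numbers and target are my own illustrative choices:

```python
from itertools import combinations

numbers = [3, 34, 4, 12, 5, 2]
target = 9

def verify(candidate):            # polynomial time: just add and compare
    return sum(candidate) == target

def solve():                      # exponential time: try all 2**n subsets
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if verify(subset):
                return subset
    return None

print(solve())                    # (4, 5) -- found by brute-force search
print(verify((4, 5)))             # True -- checked instantly
```

Whether every problem whose solutions can be verified this quickly can also be solved quickly is exactly the P versus NP question.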

The Poincaré conjecture

A rubber band on the surface of a sphere can be made smaller and smaller until it shrinks to a point. But if we imagine the band has been stretched in the right direction around the surface of a doughnut (which in mathematics is called a torus), this will be impossible to do. This is the origin of algebraic topology, which associates algebraic objects with topological objects, and allows them to be studied and classified. A sphere is said to be "simply connected", but a torus is not. Poincaré wondered a century ago (1904) whether what he was able to demonstrate in two dimensions was valid in higher dimensions, and particularly in three dimensions. The solution came a hundred years later, when Grigori Perelman published articles in 2002 and 2003 announcing that he had found the answer using Richard Hamilton's pioneering work on Ricci flow. In fact, what Perelman proved was a more general result which contained the Poincaré conjecture as a particular case, having also proven William Thurston's Geometrisation Conjecture, thereby solving one of the seven millennium problems. Perelman was awarded the Fields Medal at the ICM in Madrid in 2006 for his extraordinary achievement, although, famously, he turned down both the medal and the million-dollar cheque.

Manuel de León

Is a CSIC research professor and director of the Instituto de Ciencias Matemáticas (Institute for the Mathematical Sciences). He is a corresponding member of the Real Academia de Ciencias.

His research focuses on differential geometry and its applications in mechanics, and he has published over 200 papers and 3 monographs. He belongs to the editorial boards of several scientific journals, and he is a founder and director of the Journal of Geometric Mechanics. He is also involved in a range of outreach activities. He is a re-founder and past vice-president of the Real Sociedad Matemática Española (RSME), founder and director of the RSME's journal La Gaceta, and re-founder and president of the Comité Español de Matemáticas. He chaired the International Congress of Mathematicians ICM-Madrid 2006 and is a member of the executive committee of the International Mathematical Union.

He has been mathematics coordinator at the ANEP, a member of the experimental sciences advisory committee on evaluation and foresight, and a member of the CSIC area committee on Physical Science and Technology. He is currently a member of the PESC Core Group (European Science Foundation).

Manuel de León.

The Riemann hypothesis

The prime numbers can be considered the building blocks from which all the other integer numbers are formed. Euclid proved that they are infinite, but the pattern by which they are distributed remains an unsolved problem. Georg Friedrich Bernhard Riemann (1826-1866) observed that their frequency is related to the so-called Riemann zeta function

$\zeta(s) = 1 + \frac{1}{2^s} + \frac{1}{3^s} + \frac{1}{4^s} + \cdots$

The Riemann hypothesis asserts that the interesting solutions of the equation

$\zeta(s) = 0$

(the zeros of the function) lie on a certain vertical straight line. This is undoubtedly one of the most difficult of the problems mathematicians want to solve today.

Yang-Mills and Mass Gap

The elementary particles of physics are described geometrically using the theory of Chen-Ning Yang and Robert L. Mills. Unlike the case of electromagnetic forces, the fields of the nuclear interaction have to have mass, which is expressed by saying that there is a mass gap. But in the classical Yang-Mills theory, particles have no mass. Thus, the problem is to establish quantum Yang-Mills theory, and the existence of its mass gap, in a mathematically rigorous way. Achieving this will undoubtedly require the introduction of fundamental new ideas in physics and mathematics.

Spain and the frontiers of Mathematics

In David Hilbert's day, mathematical research in Spain was practically non-existent, despite certain individual efforts, despite the fact that Spain had been home to brilliant Arab and Jewish mathematicians during the Muslim domination, and despite Spain having been a pioneer in creating a Royal Academy of Mathematics in 1582 under Philip II.

The "regenerationist" drive in the early 20th century led to the creation in 1915 of the Laboratorio Seminario Matemático under the aegis of the Junta de Ampliación de Estudios, the precursor of today's CSIC, but this blossoming of mathematics was cut short by the Civil War. In 1980 the number of articles in ISI journals published by Spanish mathematicians was 0.3% of the total, compared with 5% today. In recent years Spanish mathematics has developed spectacularly, symbolised by the fact that the ICM was held in Madrid for the first time ever in 2006.

The books mentioned above did not include any Spanish authors, and even the number of Spaniards they cited was tiny. Today, however, the situation would be different: Spanish mathematicians have earned recognition for their work on problems such as generalised Sidon sets (Javier Cilleruelo and Carlos Vinuesa), and conjectures by John Nash (Javier Fernández de Bobadilla and María Pe) and V.I. Arnold (Alberto Enciso and Daniel Peralta Salas).

We can say, then, that not only are we able to understand the Millennium problems, but we are also able to work on them, and that is a hugely important qualitative change.

Other frontiers

The problems alluded to here refer mainly to the internal frontiers of mathematics, but the discipline is facing numerous challenges this century from other fields of science, industry, and technological development.


Henri Poincaré.

Georg Friedrich Bernhard Riemann.


An indisputable paradigm that has emerged in recent decades is the possibility of handling large quantities of data, made possible by the exponential growth in the calculating power of today's computers. Identifying patterns in the myriad data provided by astronomy, seismology, genetics and the LHC's experiments requires mathematical tools, and probably some new and more powerful ones. Moreover, the construction of new, ever more elaborate, mathematical models is the key to solving the big challenges our society faces, such as sustainable development or climate change.

Mathematics looks like an abstract construct that could exist outside the physical world. However, its frontiers sometimes get ahead of its potential applications, sometimes accompany them and sometimes lag behind; but without this discipline we would not be able to understand this universe, which, as Galileo Galilei said, is written in mathematical language. This duality is what gives mathematics its greatness and makes it attractive.

DAVID HILBERT’S 23 PROBLEMS

1. Cantor’s problem of the cardinal number of the continuum. What is the cardinal number of the continuum?

2. The compatibility of the arithmetical axioms. Are the arithmetical axioms compatible?

3. The equality of the volume of two tetrahedra with the same base and same height.

4. The problem of the shortest distance between two points. Is a straight line the shortest distance between two points on any surface in any geometry?

5. Establishing the concept of the Lie group, or continuous group of transformations, without assuming the differentiability of the functions that define the group.

6. Axiomatisation of physics. Is it possible to create a body of axioms for physics?

7. The irrationality and transcendence of certain numbers.

8. The problem of the distribution of prime numbers.

9. Demonstration of the generalisation of the reciprocity theorem of number theory.

10. Establishing effective methods of solving Diophantine equations.

11. Quadratic forms with arbitrary algebraic coefficients.

12. Extension of the Kronecker theorem on abelian fields to any domain of algebraic rationality.

13. Impossibility of solving the general seventh-degree equation by means of functions of only two arguments.

14. Proof of the finiteness of certain complete systems of functions.

15. Rigorous foundation of Schubert's enumerative calculus (algebraic geometry).

16. Problem of the topology of algebraic curves and surfaces.

17. Expression of definite forms by sums of squares.

18. Building up of space from congruent polyhedra.

19. Are the solutions to the regular problems of the calculus of variations always analytic?

20. The general problem of Dirichlet boundary conditions.

21. Demonstration of the existence of Fuchsian linear differential equations given their singular points and monodromy group.

22. Uniformisation of analytic relationships by means of automorphic functions. Is it always possible to uniformise any algebraic relationship between two variables by means of single-variable automorphic functions?

23. Extension of the methods of the calculus of variations.

05 … in studies into society and culture



Frontier research in the Humanities and Social Sciences

Javier Moscoso

Consejo Superior de Investigaciones Científicas (CSIC)

The CSIC’s new Centre for the Humanities and Social Sciences in Madrid has brought seven different research institutes, mainly in the humanities, under the same roof. Opened two years ago, the centre represents a commitment to the future on in terms of quality research, based on cooperation, with ambitious goals, and operating in a European context. With over 20 research lines, many of its teams clearly work at the frontiers of knowledge, with high levels of internationalisation and tangible results in terms of the whole gamut of indicators of excellence.

The coordination unit of the CSIC's Humanities and Social Sciences Area was at one time thinking of asking its groups of researchers a simple question, similar to that asked in other areas of the Spanish National Research Council (CSIC), to identify the five or six topics they felt represented the biggest advances in our knowledge over the last 15 or 20 years. We could have asked the same question about the top five or six authors or books. The difficulty was not the lack of ideas, but the need to choose some lines of research instead of others; an exercise whose result was perhaps more the consternation of the researchers involved than the admiration of their peers.


However, it is no secret to the area's researchers that methodological changes are afoot, with shifts away from old ways of looking at topics, and new issues and approaches emerging, either as a result of demand from society or pressure from within. The so-called "linguistic turn", "visual turn" or "affective turn" are prime examples of methodological changes that are sometimes backed by entire research groups, and whose effects have made themselves felt far beyond the limited geographic and disciplinary boundaries in which they were first conceived.

By its very nature, the Fundación General CSIC's question in this journal, as to the meaning of so-called "frontier research" in the Humanities and Social Sciences, brings with it a similar difficulty to that faced by the CSIC's coordinators when they asked their researchers to select a limited number of key topics. First of all, it calls for clarification at the outset of what is meant by the term "frontier research." This is a term which wavers between two or three meanings that are not necessarily mutually exclusive. For many scientists, "frontier" research is that which takes place at the limits of knowledge, and which questions either the existing methodology or the existing stock of knowledge, for example by formulating speculative hypotheses whose even partial corroboration would be something of a surprise, given their prima facie improbability. From this point of view, frontier research resembles the search for improbable hypotheses, as the opposite –the trivial confirmation of established ideas– would never lead to any spectacular discoveries.

Javier Moscoso

Is a research professor in the History of the Philosophy of Science at the CSIC's Instituto de Filosofía (Institute of Philosophy). With a PhD in Philosophy from the Madrid Autonomous University, he completed his studies at the Centre Alexandre Koyré in Paris, the Wellcome Institute of the History of Medicine in London (where he was a guest researcher for four years) and the Department of History of Science at Harvard University, where he remained for a further two years.

He has also worked as a contract researcher at the Max Planck Institute for the History of Science in Berlin (1997). He is currently the coordinator of the Humanities and Social Sciences Area of the Spanish National Research Council (CSIC).

As a researcher he has worked in three fields: the history of the life sciences, the history of singularities and the development of the science of teratology, and the history of pain. In addition to sole-authored monographs, compilations and publications in various languages arising from these projects, he has also organised a number of exhibitions.

His latest book, Una historia cultural del dolor (A cultural history of pain), is due to be published in English by Macmillan-Palgrave later this year and in Spanish by the publisher Taurus.

It is clear that not all excellent science is “frontier” science, but neither is all “frontier” research necessarily good science

Javier Moscoso.


For many other scientists, however, the word "frontier" does not necessarily refer to what is at the limit of the probable on the basis of our prior knowledge, but to research which moves at the boundaries of disciplinary frameworks, their theoretical structures, and their departments and social institutions. The forms of speciation of scientific knowledge –the historical rise of numerous disciplines– are due in part to these inter-territorial pairings, where some sciences (together with the scientists practising them, the laboratories in which they are pursued, and the budgets paying for them) can draw upon others, incorporate other scientists' work or use their results for their own benefit. Last, but not least, for many scientific policy-makers, frontier research is synonymous with quality research; that is to say, synonymous with scientific excellence. Serious misunderstandings sometimes arise when it is understood in this latter sense, as it often happens that the way in which excellence is measured (for example, in terms of impact factors) is confused with the purpose or social relevance of research (for example, in terms of transfer mechanisms) and is understood in a monolithic and inappropriately dimensioned way, such as when the only type of transfer considered possible is technology transfer, expressed as the number of patents.

In the case of the humanities and social sciences, research may be "frontier" in all three senses described, although it needs to be made clear from the outset that not all excellent science is "frontier" science, nor is all "frontier" research, for this reason alone, necessarily good science. Although it is often the case that the best science is at the frontier, it would be to commit the fallacy of affirming the consequent to conclude that working at the frontier of science is a sufficient condition for science to be excellent. In a spectrum as wide (and artificial) as the body of knowledge and practices that fall under the generic heading of the humanities and social sciences –which range from experimental sciences (like psychology, archaeology or linguistics), to quantitative sciences (such as sociology, demography or geography), and those with more qualitative methodologies (such as history, literary theory or art)– movements also take place, of course, that are associated with the appearance of new objects of study, with the introduction of new research techniques or the adoption of new epistemic values. At the same time, some of these lines of "frontier" research are also at the frontier because they involve collaboration between different disciplines and fields. It is not unusual, for example, to find research groups bringing together people with different academic backgrounds collaborating on solving common problems. On some occasions, these groups take on more institutionalised forms, such as inter-university departments, or even lead to research centres, as in the case of the CSIC's recently created Institute of Heritage Sciences (Instituto de Ciencias del Patrimonio, INCIPIT) in Santiago de Compostela. Even when researchers from areas that many people consider "more scientific" –such as organic chemistry, mathematics, physics or computation– participate in some of these synergies between disciplinary niches, the research is not "more" or "less" frontier-oriented because it is associated with methodologies or epistemic values intrinsic to the experimental sciences. Conversely, research on global change, which includes experts in archaeobotany or prehistory on its teams, is not better or worse for this fact.

As the philosopher Martha Nussbaum has pointed out, it is only totalitarian regimes that shun research in the humanities. By contrast, critical research in these areas, and their institutional health, are a measure of the strength of democracy in the countries in which it is pursued


The quality of research is always determined by the quality and impact of its results. At this point it is worth dispelling the widely held fallacy that, because the evaluation criteria for research in the humanities and social sciences are apparently less rigid or less objective than in other fields of knowledge, their contribution to quality research will be more questionable. As we shall see, nothing could be further from the truth.

Frontier research in the Humanities and Social Sciences

In 2005 and 2006, the Directorate General for Research at the Ministry of Education and Science (MEC) carried out a study to compare the results of evaluations of the projects included in the National Research Plan conducted by the National Evaluation and Foresight Agency, ANEP (Agencia Nacional de Evaluación y Prospectiva), with those of the ministry's own panels of experts. Contrary to what might have been expected, the degree of agreement between the two groups of evaluators, who moreover used different methodologies from among the set of internationally accepted procedures, exceeded all expectations and threw up a very interesting conclusion: no mechanical evaluation tool can substitute for the work of evaluators who, often before seeing the impact factors or other bibliometric measures, already know what good research looks like; a fact borne out by the history of science.

What philosophers call the "myth of mechanical objectivity" has made itself felt in research in the humanities and social sciences, and has also affected the form and content of its interdisciplinary research. On the one hand, there is the widespread prejudice that the most appropriate form of research in this area relies on cooperating with other sciences applying quantitative or experimental methodologies (which is absolutely false, as there is also experimental research in various fields of the human sciences, as archaeology, linguistics and psychology attest). On the other hand, there is a growing suspicion that, in view of the absence of quantitative procedures and methods –which is also utterly untrue– no claims can be made for this field's interdisciplinary research or its mobilisation of "improbable" hypotheses. These two conclusions are not just incorrect, but are the embodiment of a more widespread preconception about the role of the humanities and social sciences in the context of national research policies. As in other similar cases, this prejudice also permeates society, where it feeds on an ignorance of the history and philosophy of science, which is to say a lack of the very knowledge that certain branches of the humanities can and do provide.

Once aboard the train of progress, some people are only interested in how much effort the journey will need or the laws of motion affecting the train, without stopping to think of the direction, number and origin of the passengers, or the destination. As the philosopher Martha Nussbaum has pointed out, it is only totalitarian regimes that shun research in the humanities. By contrast, critical research in these areas, and their institutional health, are a measure of the strength of democracy in the countries in which it takes place. However, it does not take a renowned political philosopher to reach a conclusion which can be drawn simply by reading a newspaper carefully. Here, as elsewhere, the paradox of ignorance is that the practitioner is determined to ignore precisely what could be the remedy and cure.

Over the last ten or fifteen years, the humanities and social sciences have positioned themselves around new research topics, which have inevitably led them to develop new forms of interdisciplinary cooperation


Quality research

Over the last ten or fifteen years, the humanities and social sciences have positioned themselves around new research topics, which have inevitably led them to develop new forms of interdisciplinary cooperation. In some instances, such as the case of the CSIC's new Centre for the Humanities and Social Sciences (Centro de Ciencias Humanas y Sociales, CCHS) in Madrid, these shifts have received institutional backing. For the sponsors of this project, which has brought together seven different research institutes, mostly from the humanities, the centre, which opened two years ago, represents a commitment to the future based on quality research, set in a European context, taking a cooperative approach to achieving its ambitious goals. The atomisation and fragmentation of research groups, in conjunction with meagre and uneven funding, has always been one of the biggest obstacles to Spanish science. This new centre offers the most recent counterexample, but by no means the only one. With over 20 research lines, many of its teams clearly work at the frontiers of knowledge, with high levels of internationalisation and tangible results in terms of the whole range of indicators of excellence: whether measured by the impact of their publications, their sources and amounts of funding, or the transfer of their results. In some cases, their research focuses on problems with a considerable social impact and growing political interest, such as ageing or global change. Other projects take place in cooperation with similar institutions, such as Programa Convivencia, a joint initiative by the CSIC and the Max Planck Society studying the history of forms of cultural integration in the late Middle Ages. In other cases, such as studies of archaeology and social processes, the research teams include researchers from history, along with others from philology or art, and their results add enormous social impact to their scientific excellence.

At the CCHS, as in many other universities and research centres in Spain and abroad, frontier research in the humanities and social sciences is associated with new objects of study, such as the growing interest in cognitive processes, which normally include collaborations between psychologists, linguists and philosophers of mind. There is also an increasing focus on studies of the emotions, which mobilise cultural historians, theoreticians of art, experts in visual culture, sociologists and anthropologists.

By its very nature, "frontier" research in any of the senses described cannot guarantee a positive return from every investment, although it is the best guarantee of success overall. This means that, contrary to what happens in more conservative forms of research, where the results are predictable and even inconsequential, frontier research must accept partial failure as an essential part of its day-to-day work. In this regard, the pressure on research groups in Spain and abroad to do quality science that has ambitious objectives and eschews predictable results is not being backed up with commensurate resources from government to promote knowledge and science. Nevertheless, the history of science has shown over and over again that public or private investment in research is only recouped in the medium to long term, and that the path is a winding one, punctuated with failed attempts. In the long term, the pursuit of quick returns from research has, historically, led only to intellectual stagnation and economic paralysis.

By its very nature, "frontier" research in any of the senses described cannot guarantee a positive return from every investment, although it is the best guarantee of success overall



Language: learning and use

Nùria Sebastián centres her article on a series of questions concerning the study of language acquisition and processing: initial capabilities and the origins of language, learning two languages, and language as a social tool. She also debunks some popular myths and points to directions for future research.

Nùria Sebastián

Universitat Pompeu Fabra

The frontiers of knowledge are usually concerned with problems that people find extremely hard to grasp. At first sight, therefore, it is somewhat surprising that something that even a small child can do could be a subject for frontier research. However, the question of how we come to speak is extraordinarily difficult to answer. Language is a particularly hard problem because, unlike many other aspects of cognition, such as visual or auditory perception, memory, etc., there are no animals we can study that are able to learn and use a similar system. Obviously, for ethical reasons, research in the field is limited to non-invasive methods. In addition, researchers face the extra problem that everybody seems to think they know all about the subject, although ignorance of the actual scientific knowledge on the topic is almost universal.

Initial capabilities: the origins of language

It has often been considered useless to talk to babies during their first months of life because they do not understand what is said to them and their brain has not matured enough to "absorb" language. However, nothing could be further from the truth. At birth babies are already able to distinguish some languages from others, such as English from Spanish, or Dutch from Japanese (it is not until 4-5 months, though, that they can distinguish Italian from Spanish or English from Dutch). They can even do this if we modify the way sentences are spoken so that they contain the same phonemes (sounds), so as to ensure that the babies are not using information such as the presence of a particularly distinctive sound which is only heard in one of the languages (for example, the sound of the letter J in Spanish). However, they cannot identify languages if we turn the sentences around and play them backwards. At birth babies can also distinguish the phonemes of all the world's languages, even if they have never heard them before because their parents' language doesn't use them (for example, Japanese newborns can distinguish /r/ from /l/). These abilities are shared with other species.


Marmosets and rats can distinguish Dutch from Japanese under exactly the same conditions as human babies. However, at eight months human babies can make certain calculations of the probabilities of syllables in speech that rats cannot. One currently active area of research focuses on studies comparing the abilities of human babies in the first few months of life with those of other species. It is likely that this type of study will give us more insights into the evolutionary origins of language than (futile) efforts to try to teach language to chimpanzees and bonobos.
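The syllable-probability computation mentioned above can be made concrete with a small sketch. The following Python fragment is purely illustrative (the three-syllable "words" are invented here, in the style of classic statistical-learning experiments, and are not data from the article): it shows that the transitional probability of one syllable following another is high inside a word and low across a word boundary, which is the contrast eight-month-olds appear to track.

import random
from collections import Counter

# Toy syllable stream: three invented "words" concatenated in random
# order, with no pauses between them (illustrative only).
words = [("tu", "pi", "ro"), ("go", "la", "bu"), ("bi", "da", "ku")]
random.seed(0)
stream = [syl for _ in range(300) for syl in random.choice(words)]

pair_counts = Counter(zip(stream, stream[1:]))  # syllable bigram counts
first_counts = Counter(stream[:-1])             # counts of the leading syllable

def transitional_probability(a, b):
    """P(b | a): how often syllable a is immediately followed by b."""
    return pair_counts[(a, b)] / first_counts[a]

# Inside a word the next syllable is fully predictable...
print(transitional_probability("tu", "pi"))  # 1.0
# ...across a word boundary it is not (three possible continuations).
print(transitional_probability("ro", "go"))  # roughly 1/3

A learner (or model) that posits word boundaries wherever this probability dips would recover the three "words" from the unsegmented stream, with no pauses or other cues needed.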

During (approximately) the first six months of life, human babies' language-processing capabilities do not seem to be strongly influenced by the language in their environment (although at birth babies prefer to listen to the language spoken by their parents, suggesting intra-uterine influence). At around six months, a significant change takes place, and babies begin to tune in to the language they hear: for example, they lose the ability to discriminate sounds to which they are not exposed and improve their ability to perceive the phonemes in their language. At this point they begin to recognise certain words (their name, for example) and the brain's response to language is clearly lateralised (in most adults, the brain structures of the left hemisphere show stronger activation patterns than those on the right during language processing). Studying the mechanisms underlying language acquisition in the early months of life is a highly active field, although the study techniques are limited, as registering brain activity (using both neuroimaging techniques and EEG readings) requires the subject to be immobilised... and it is not easy to get a 10-month-old baby to stay still for a few minutes! Moreover, the study techniques have to be adapted constantly as infants mature (see Photo 1). Investigating how human beings are able to learn language, with its extraordinarily complex formal properties, in such a short period of time, is a current challenge.

Learning two languages

It is well known that learning a second language in childhood often gives better results than doing so in adulthood. There are several important questions about language acquisition and processing surrounding this issue.


Nùria Sebastián

Has a PhD in Experimental Psychology from the University of Barcelona (1986). She undertook her post-doctoral training at the Max Planck Society, the CNRS Laboratoire de Sciences Cognitives et Psycholinguistique (Paris) and the Institute of Cognitive Neuroscience at UCL (London). She became a lecturer in 1988 and professor in 2002 in the faculty of psychology of the University of Barcelona. In 2009 she moved to Pompeu Fabra University (Barcelona). In 2001 she was awarded the James S. McDonnell Foundation prize ("Bridging Mind, Brain and Behavior" Program) and in 2009 she was awarded the ICREA prize. She was a member of the OECD's "Brain and Learning" advisory group from 2002 to 2006. She currently leads the SAP research group (language acquisition and processing) at Pompeu Fabra University and is the coordinator of the Consolider-Ingenio 2010 research consortium for bilingualism and cognitive neuroscience research (BRAINGLOT).

She has published over 70 papers in international scientific journals, and has been an associate editor of Developmental Science since 2005.

At birth babies are already able to distinguish some languages from others, such as English from Spanish, or Dutch from Japanese, although it is not until 4-5 months that they can distinguish Italian from Spanish or English from Dutch

Nùria Sebastián.


The most extreme case of early learning is that in which the infant is born into a multilingual family. How long does it take for babies to perceive that there is more than one language in their environment? As I said, at birth babies are able to detect the differences between some languages, so if their environment includes languages that are sufficiently different they will notice from birth. But if they are born hearing more closely related languages (such as Catalan and Spanish) they will take about 4-5 months: at this stage babies are able to discriminate languages such as English and Dutch, or Spanish, Italian and Catalan. Research has shown that babies growing up in bilingual settings develop in an equivalent way to children growing up in a monolingual setting, regardless of how similar or different the languages are. However, equivalent does not mean identical. Bilingual babies develop specific processing strategies that allow them to acquire the two languages. An important part of the research in this field aims to elucidate what these specific strategies are and what impact they have on cognitive development and function (various recent studies show that bilingual individuals have a slight advantage over monolingual individuals when reacting to conflicting stimuli: they respond a few milliseconds earlier and activate certain cortical areas more effectively).

Most people learn a second language later in life, however. One of the most controversial concepts in this area is whether or not there is what is called a "critical period." The widely held view is that learning a language before the end of the critical period enables perfect acquisition, but that when it is learned later, this is impossible (the age given varies from seven years to adolescence). The issue is that, rather than critical periods, it makes more sense in biological terms to talk of "advantageous periods": during these periods, stimulation has a maximum effect; outside of them, the same amount of stimulation is less effective. The problem with determining whether or not there is an advantageous period for language acquisition is that the different types of knowledge involved (phonemes, morphology, syntax, etc.) recruit different brain networks, which require different nuclei to operate. In the most extreme case of early learning, in which a baby is born into a multilingual family, in the early stages substantial parts of the brain remain functionally inoperative, as the brain does not mature uniformly. It is therefore in an optimal state to learn the different subsystems of language at different times. There are aspects for which there do not seem to be time limits (we can learn words throughout our lives), but for others there seem to be advantageous periods which are relatively limited in time. Recall that by the sixth month of life, humans begin to tune in to the perception of the phonemes and the properties of the language in their environment. This "advantageous period" for the acquisition of phonemes occurs between the ages of six and twelve months. The start and end of the advantageous periods for which animal models exist, such as vision, suggest that certain transcription factors play a fundamental role (one of the proteins proposed is Otx2).

Photo 1. Research with infants uses habituation-dishabituation techniques. Certain types of stimuli can be self-administered by babies’ sucking on a teat or by a system that tracks their gaze. After a time, babies get bored and the response rate drops. When this reaches a threshold they are presented with new stimuli of either the same category or a different category. Only if they notice the change in category does the response increase. Source: www.infantstudies.psych.ubc.ca

The most extreme case of early learning is when a baby is born into a multilingual family



Molecular models for the start and end of critical periods in vision offer an initial framework in which to understand the dynamics of brain plasticity in human beings. With respect to initial language learning, one issue that needs to be resolved is that of establishing a relationship between environmental factors (particularly rich or poor exposure, certain types of medication, nutrition, etc.) and the specific dynamics of the language-learning process. Clearly, establishing these links for a large part of language and linguistic knowledge is a bridge too far (or indeed perhaps futile). At present it is impossible even to formulate hypotheses about the underlying molecular basis of the acquisition of, for example, subject-verb agreement in a sentence. But bridges already exist (or at least, the scaffolding does) for some of the knowledge that babies acquire in the first few months of life.

A study of the initial mechanisms involved in the learning (and use) of language may provide an excellent model for the influence of epigenetic aspects on healthy development. Often the study of the impact of initial experience on development is based on models of aberrant exposure. Numerous studies have shown that hostile environments in the initial periods of development are good predictors of physical or psychiatric illness. However, the impacts of variations that lie within the normal range are more difficult to study. The natural variability that occurs in languages offers models of non-pathological variation. For example, languages vary significantly in terms of the number of phonemes (Spanish has just five vowels, whereas Dutch has almost 20). Linguistics provides us with theoretical models of the formal differences between languages, which tend to be reflected in differences in processing. Neuroimaging studies show that the brain differs depending on language exposure not only morphologically (for example, there are differences in the size of auditory perception areas between monolingual and bilingual subjects), but also in specific patterns of neuronal activation. The specific mechanisms that are necessary to understand a "head initial" language like Spanish are not exactly the same as those required by a "head final" language like Basque. There are differences in the processing and the type of computations the brain needs to carry out to understand two sentences that mean the same thing in Spanish and Basque.

sentence “la mujer que ha escr i to este l ibro” ( “ the woman who wro te th i s book”), at the start of the sentence we already know that it is about a woman, to which the rest of the phrase adds details. In Basque, this sentence would be “ l iburu hau idatzi du-en emakume-a” (where the word order is something like “book this written that woman-the”), where first we are given the details, and the information about what these detai ls refer to comes at the end of the sentence. These differ-ences imply different storage and processing needs in each language. Or, to give a more ext reme example , compare the Spanish “volv-eremos” with i ts Engl ish equivalent, “we will come back.” Both sentences mean the same thing, but in the case of the Spanish sen-tence, the subject (“we”) and be future tense have to be deduced f rom spec i f i c sequences within the parts of the word; whereas in Eng-lish, each part of the mean-ing is represented by a sep-arate word. To learn a new language, we have two mod-ify processing strategies that are deeply entrenched in the brain, as they are learned very early in life and continu-ally practised. Most human beings achieve this with rela-tive success on the basis of

effort and years of practice. However, there are some indiv iduals who seem to have a natural talented for language acquisition or who are said to have a “gift” for languages.

In recent years there has been a proliferation of studies relating the performance of specific cognitive tasks to certain genetic variations. Given the complexity involved in our understanding of language (Figure 1), it is not surprising that it is particularly difficult to identify the genes that play a significant role in language learning and use. Although various genes associated with the appearance of a number of language pathologies have been identified (the most famous is FOXP2, but CNTNAP2, ATP2C2, CMIP and genes for lysosomal enzymes have also been targeted), these studies are based on the analysis of pathologies and diagnoses, and progress needs to be made on incorporating recent knowledge about language learning and function.

Language as a social tool

One area that is currently generating a lot of interest is the interplay between language processing and social phenomena.


Several studies on adults and five-year-old children have shown that people prefer unaccented speakers to people who speak with a foreign accent. Studies conducted in the US show that, if the choice is between ethnicity (race) and accent, children (and adults) prefer a person of a different ethnicity who speaks with their accent over a person of the same ethnicity but with a foreign accent. The preference for people who speak without an accent is observed even in five-month-old babies: babies prefer to watch (they hold their gaze for longer) silent videos of people whom they have previously heard speak without an accent rather than videos of people they have previously heard speaking with a foreign accent.

And the influence of social issues goes beyond mere preference. When we hear someone we consider to be higher up the social hierarchy than ourselves, even if only because they play a video game better than us, our perception of what they say causes more powerful cortical activations than when we hear someone we consider to be of an inferior social rank. This is true not only at the level of the overall meaning of sentences, but also at the level of recognition of isolated words: we perceive a word better if it is spoken by someone we consider to be of superior rank. The impact of social hierarchy on brain function is a very recent field, and its results should be taken into account when making decisions in various areas, particularly in education.

Concluding remarks

The study of language (and cognition in general) requires cooperation between psychologists, linguists, molecular biologists, neurologists and also engineers, physicists and mathematicians (computational modelling of cognitive neuroscience is an aspect of the subject that it has not been possible to address in this short introduction). The results of these studies should help us not only to design more effective rehabilitation, but also to enhance fundamental aspects of daily life, especially in education.

/// Figure 1. The process of language understanding ////////////////////////////////////////////////////////

In the process of understanding language, the brain uses network interactions to compute different types of units. Starting with the extraction of the physical characteristics of speech (in fuchsia), the brain identifies the patterns that make up the phonemes and their sequences. It computes the various units according to the specific properties of each language (in orange). Thus for a Spanish speaker the sequence "Melo" is not a word, but could be one, whereas the sequence "Mleo" is impossible. Specific knowledge of words is represented at different levels in the brain (in purple and green in the figure). Sentence comprehension (not shown in the figure on the right) involves computing activation patterns corresponding to syntactic-semantic properties (for example, the need, or not, for the preposition "a" in Spanish according to whether a human being or an object is involved). The final steps in language understanding imply the ability to integrate what we hear with our "understanding of the world" (the anomaly in the sentence "the deer shot the hunter" is of this type). In the case of pathologies, some components may be altered, but not others, indicating differences in how the underlying structures are affected. Thus, for example, sufferers of Alzheimer's disease may have serious difficulties integrating knowledge of the world or the meaning of words, but retain their knowledge of phonetic-phonological analysis.

[Figure 1 labels: Auditory analysis; Phonetic-phonological analysis ("Melo" vs "Mleo"); Lexical analysis ("I met my brother" / "I found my pencil"); Parsing; Semantic analysis; Knowledge of the world ("The deer shot the hunter"). Source: image of the brain adapted from Poeppel D., Monahan P. J., Current Directions in Psychological Science 2008;17:80-85.]



Modernity: understanding our present time

Peter Wagner

Universitat de Barcelona

The most common view of modernity – although it is not without its problems – holds that the term refers to a new type of society that emerged from a sequence of major transformations in Europe and North America culminating in the industrial and democratic revolutions in the late eighteenth and early nineteenth centuries. In this article, Peter Wagner reviews the ways in which Sociology and Philosophy have tried to understand modernity, in order to help us understand our own modernity.

For a long time, it was common to think that a single and unique 'modern society' originated in the West and that it opened up a new and better era in the history of humanity. Today, we need to rethink these ideas in the light of the current global condition of modernity. Modernity's claims and expectations have become inescapable in ever more walks of life and for many more people than ever. In the course of their realisation and diffusion, however, these claims and expectations have also been radically transformed. New issues have arisen, the following of which are central for our time.



First, sociologists and philosophers have long maintained that there is – indeed, that there can only be – a single model of modernity. However, modern institutions and practices have not remained the same over time, and furthermore there is now a variety of forms of modern socio-political organisation. What does this entail for our idea of progress, or in other words, for our hope that the future world can be better than the present one?

Second, modernity was based on the hope for freedom and reason, but it created the institutions of contemporary capitalism and democracy. How does the freedom of the citizen relate to the freedom of the buyer and seller today? And what does disaffection with capitalism and democracy entail for the sustainability of modernity?

Thirdly, our concept of modernity is in some way inextricably tied to the history of Europe and the West. How, then, can we compare different forms of contemporary global modernity in a 'symmetric', non-biased or non-Eurocentric way? How can we develop a world-sociology of modernity?

A look at the ways in which sociology and philosophy have tried to understand modernity can guide us in the attempt to understand our present modernity.

The most common – even though far from unproblematic – view about modernity holds that this term refers to a novel kind of society that emerged from a sequence of major transformations in Europe and North America culminating in the industrial and democratic revolutions in the late eighteenth and early nineteenth centuries. Significantly, this view often entails both that these transformations catapulted Europe (or the West) to the front position in the course of world history and that the thus established Western model would diffuse worldwide because of its inherent superiority. Thinking about modernity thus meant thinking about globalisation, even though these terms have come into frequent use only since the 1980s and 1990s respectively.

Global – or universal – significance was claimed for European modernity from the very beginning. A key event in the formation of what we consider as modern Europe was the so-called discovery of the Americas with their hitherto unknown populations, and this event triggered European reflections about the nature of humankind and provided a background to philosophical speculations about the 'state of nature', as in John Locke's Second treatise on government (1690).

Peter Wagner

Peter Wagner is ICREA Research Professor of Sociology at the University of Barcelona and Principal Investigator of the research project 'Trajectories of modernity' (TRAMOD), funded by the European Research Council. His research focuses on questions of social and political theory, comparative-historical and political sociology and the sociology of knowledge. In particular, he has aimed at comparatively analysing the history of European and non-European societies in terms of transformations of modernity. Before coming to Barcelona, he was Professor of Sociology at the Universities of Trento, Italy, and Warwick, United Kingdom, as well as Professor of Social and Political Theory at the European University Institute, Florence. His publications include Plurality and progress: modernity in political philosophy and historical sociology (Malmö: NSU Press, 2010); Modernity as experience and interpretation: a new sociology of modernity (Cambridge: Polity, 2008), Varieties of world-making: beyond globalization (ed. with Nathalie Karagiannis, Liverpool: Liverpool University Press, 2007), Theorizing modernity and A history and theory of the social sciences (both London: Sage, 2001) and A sociology of modernity: liberty and discipline (London: Routledge, 1994).

A key event in the formation of what we consider as modern Europe was the so-called discovery of the Americas with their hitherto unknown populations

Peter Wagner.


as in John Locke’s Second t rea t i se on government (1690). From René Des-c a r t e s ’s D i s c o u r s e o n method (1637) onwards, E n l i g h t e n m e n t t h o u g h t claimed to have established the very few, but absolutely firm foundations on which universal knowledge could be erected, most basically freedom and reason. The American and French revolu-tions were seen as having i nescapab ly i n t roduced humanity to liberal democ-racy, based on individual rights and popular sover-eignty. Already in his Democ-racy i n Amer ica o f the 1830s, Alexis de Tocqueville considered equal universal suffrage as the end-point towards which political his-tory would be moving. And from Adam Smith’s Wealth of nations (1776) to the mid-nineteenth century, political economists claimed to have discovered in market self-regulat ion an absolute ly superior form of economic organisation. In the Commu-nist Manifesto (1848), Karl Marx and Friedrich Engels provided an image of eco-nomic globalisation whose evocative power has not been surpassed ever since.

A common basic understanding of modernity underlies this debate that stretches over two centuries and addresses very different aspects of human social life. Modernity is the belief in the freedom of the human being – natural and inalienable, as many philosophers presumed – and in the human capacity to reason, combined with the intelligibility of the world, that is, its amenability to human reason. In a first step towards concreteness, this basic commitment translates into the principles of individual and collective self-determination and into the expectation of ever increasing mastery of nature and ever more reasonable interaction between human beings. The Declaration of the rights of man and of the citizen (1793) as well as the granting of commercial freedom can be understood as applications of these underlying principles of modernity, as can the technical transformations captured by the term Industrial Revolution.

These principles were seen as universal, on the one hand, because they contained normative claims to which, one presumed, every human being would subscribe, and on the other, because they were deemed to permit the creation of functionally superior arrangements for major aspects of human social life, most importantly maybe the satisfaction of human needs in market-driven production and the rational government of collective matters through law-based and hierarchically organised administration. Furthermore, they were seen as globalising in their application because of the interpretative and practical power of normativity and functionality.

None of these claims, however, was straightforwardly accepted. Even though the intellectual commitment to these principles was possibly widespread, doubts always existed about the possibility or probability of translating these principles into institutional arrangements without considerable modifications and losses. Immanuel Kant and Karl Marx were among the early critics. The former was committed to the idea of enlightened and accountable government and expected the republican principle to flourish worldwide. However, he did not believe in what might have been considered the crowning of such a process, the creation of a world republic, but argued for the normative superiority of a global federation of republics instead (On perpetual peace, 1795).

Modernity is the belief in the freedom of the human being – natural and inalienable, as many philosophers presumed – and in the human capacity to reason combined with the intelligibility of the world, that is, its amenability to human reason


Karl Marx's 'critique of political economy' (thus the subtitle of Capital, 1867), in turn, undermined the belief that the transformation of the human being into a market agent was based on the principles of liberty and equality, as political economy had suggested. This novel social formation, which he referred to as bourgeois society, rather divided humankind into two classes, the owners of the means of production and those who had only their labour power to sell, who stood in an increasingly antagonistic relation to each other.

By the beginning of the twentieth century, the trajectory of European (or Western) societies had separated so considerably from those of other parts of the world that the particularity of 'Occidental rationalism', as Max Weber put it, though not without hesitation, had become a key topic of historico-sociological investigation. The ambiguity of Weber's terminological choice would stay with the debate on modernity ever since. Weber seemed to claim both that rationalisation had Western origins, and even preconditions in Western cosmology, and that it had 'universal significance', the latter not without adding the much overlooked parenthesis 'as we are inclined to think'. Thus, it permitted both the promoters of modernisation theory in the 1960s and the more recent advocates of 'multiple modernities' to refer to Weber as the main source of inspiration. The former, headed by Talcott Parsons, suggested that the Western 'breakthrough' to modernity would be emulated by elites in other societies because of its normative and functional superiority, and that therefore Western modernity would diffuse globally in processes of 'modernisation and development', as the sociological jargon of the 1960s had it. The latter, inspired by the late Shmuel N. Eisenstadt, did not deny the 'universal significance' of Western social transformations since the 1700s, but held that the encounter of other civilizations with Western modernity did not lead to the mere diffusion of the Western model but rather to the proliferation of varieties of modernity, generated by the encounter of different 'cultural programmes', which had consolidated much earlier, with Western ideas and practices.

The opposition between neo-modernization theory and the multiple modernities theorem, which marks current sociological debate on modernity, tends to sideline the third aspect of Weber's view of 'Occidental rationalism', namely a profound scepticism as to the fate of modernity. From this angle, Weber's reflections stand mid-stream in the tradition of a profound critique of modernity that was elaborated between the middle of the nineteenth and the middle of the twentieth century, with Karl Marx marking the beginning and Theodor W. Adorno the end, at least in its strong form, of this approach. Marx accepted the modern commitment to freedom and reason, as his expectation of a future 'free association of free human beings' shows, but emphasized the impossibility of realising it under conditions of class domination. Market liberty in bourgeois society would lead to alienation and commodification, human beings relating to each other as things. Similarly, Weber saw the Protestant Reformation as an increase of individual autonomy, eliminating the institutional mediation of the church between the believer and God (The Protestant ethic and the 'spirit' of capitalism, 1904/05).
Once the social ethic associated with Protestantism, which emphasizes professional commitment and success, had contributed to bringing about the institutions of modern capitalism, however, a rationalised conduct of life would be imposed on the inhabitants of the ‘dwellings made of steel’ (the common rendering of Weber’s stählernes Gehäuse as ‘iron cage’ is rather misleading) characteristic of modernity. Adorno and Max Horkheimer (Dialectic of Enlightenment, 1944) provided the most extreme version of the idea that the modern commitment to freedom and reason tends towards self-cancellation in its transformation into historically concrete social forms. They saw the origins of this regression in the very philosophy of the Enlightenment, which, in its insistence on the knowability of the world, transforms all qualities into mere quantities of the same and reduces the unknown to the status of a variable subject to the rules of mathematical equations. Such conceptualisation entered into a totalising alliance with industrial capitalism and produced, by the middle of the twentieth century, a society dominated by the culture industry, in which nothing could be heard or touched that had not been heard or touched before. Novelty and creativity were equally eliminated in societies as otherwise different as the mass culture of the United States, Nazi Germany or the Stalinist Soviet Union.

Such radical critiques of modernity gradually lost their persuasive power during the second post-war period of the twentieth century. An echo is found in Herbert Marcuse’s analysis of ‘one-dimensional man’ and ‘one-dimensional society’ (1964), a diagnosis whose reception in the student revolt of the late 1960s both demonstrated its appeal and tended to undermine its validity, since the cultural revolution of ‘1968’ arguably (re-)introduced a plurality of dimensions into the contemporary world. When Zygmunt Bauman revived the analysis of modernity as the obsessive attempt to create order and eliminate ambivalence (Modernity and the Holocaust, 1989; Modernity and ambivalence, 1991), he did so partly in historical perspective, offering a novel view of the Nazi genocide of the European Jews as an utterly modern phenomenon, and partly by situating his own writings at the exit from such organised modernity towards a post-modernity that reintroduced a concern with freedom, even though possibly a transformed and reduced one compared to earlier promises.

Such views of modernity undergoing a major transformation had indeed arisen from the late 1970s onwards, pioneered by Jean-François Lyotard’s The postmodern condition (1979). Lyotard radicalised the earlier sociological debate about a transformation from industrial to post-industrial society, promoted by authors such as Raymond Aron and Daniel Bell, by suggesting that the emerging social configuration was of such novelty that established concepts could no longer grasp it. His work thus contributed to launching a broad investigation, which has characterised much of political philosophy and comparative-historical sociology since, into the openness of the modern commitment to freedom and reason to a plurality of possible interpretations. As a consequence, the earlier opposition between an affirmative view of modernity as the institutionalisation of freedom and reason, on the one hand, and the critical analysis of the self-cancellation of the modern normative commitment, on the other, could now be re-read as evidence of, first, the ambiguity of the conceptual underpinnings of modernity and, second, the variety of
possible translations of those commitments into institutionalised social practices, such as democracy and capitalism.

The doubts about the stability and superiority of Western ‘modern society’ were not confined to theoretical reflections during this period. The late 1970s and early 1980s were the years in which the Iranian revolution brought an end to the idea that non-Western societies were just a bit behind on the same modernising trajectory on which Western ones had embarked; the rise of the Japanese economy – and later the Taiwanese, South Korean and now the Chinese one – suggested that a capitalism with a non-Protestant cultural background could compete successfully with the allegedly more advanced economies; and the rise of neo-liberal ideologies (monetarism and supply-side economics, as they were then known) to governing power in the UK and the US, together with the concomitant failure of the economic policy of a socialist-led government in France, signalled the end of the optimism that market economies could smoothly be steered by national governments. Furthermore, these years were bracketed by the student, workers’ and civil rights movements of the late 1960s, which suddenly interrupted the tranquility of the apparent post-war social consensus, on the one side, and by the collapse of Soviet-style socialism between 1989 and 1991, on the other. There was plenty of everyday evidence at hand suggesting the need to interrogate the contemporary human condition anew.

Such observations and reflections gave new impetus to research on modernity. In political philosophy and social theory, the nature of the ambiguity, and thus plurality, of the modern commitment requires further investigation, not least with a view to understanding the degree of openness of this commitment to interpretation and to reviewing, though not necessarily discarding, the universalist claims that had accompanied it from its beginnings. In sociological research, the hypothesis of a recent major transformation of ‘modern societies’ between the 1960s and the present has informed many analyses from the mid-1980s onwards. Such research needs to address in particular the question of whether this transformation, if it is ongoing, shows a specific direction, breaking with or confirming the tendencies of modernity postulated in earlier theorising. Finally, a comparative sociology of modernities needs to investigate whether the observable plurality of modern forms of socio-political organisation was created by specific historical trajectories, and to explore the conditions for the persistence of such plurality under current conditions of globalisation.

This threefold task is reminiscent of the interpretations given to Weber’s reflections on modernity, but the current condition of global modernity tends to sharpen the issues raised in earlier theorising. The plurality of modern forms may lend itself to varieties of competing world-making projects, but at the same time the often-observed homogenising tendencies of globalisation may impose a return to the view of modernity as a single and unique form of social and political organisation without lasting alternatives. In that latter case, though, the critique of modernity may emerge in a new guise: as a critique of alienating individualisation and commodification that entails the risk of the loss of the world as a meaningful dwelling space – the risk of worldlessness.


06 Forum



Frontier research: bringing the future closer
Javier Rey, Director of the Fundación General CSIC

In his book “The Structure of Scientific Revolutions”, Thomas Kuhn describes a model of the dynamics of science characterised by periods of calm, which he calls normal science, governed by a particular paradigm that provides a conceptual and methodological framework with which to find answers to the questions considered relevant within the context of that paradigm. These periods of calm are interrupted when anomalies emerge: questions arise that cannot be answered, or findings and observations are made that contradict one or more of the principles underpinning the paradigm. In Kuhn’s view, it is in these phases of crisis and agitation that paradigm shifts take place: scientific revolutions which change the course of development of science in a particular area. Sometimes the new paradigms are not directly comparable with those they have displaced (something which Kuhn and other philosophers of science, such as Paul Feyerabend and, to some extent, Imre Lakatos, have called the incommensurability of paradigms, a controversial concept subsequently discussed and qualified by Kuhn) and may even coexist with them, albeit describing different realities or operating on different scales, as with Newtonian mechanics and relativistic mechanics.

Towards the end of his life, Kuhn moved away from the notion of scientific revolutions understood in terms of the stabilisation and destabilisation of paradigms, and embraced a somewhat different concept based on the conceptual and methodological isolation of scientific disciplines from one another. Disciplines end up splitting off from one another not only because they use different languages, but because their concepts and methodological frameworks are so different that they come to inhabit different universes.

Although the evolution – rather than revolution – of science has important components that are more sociological than epistemological, the Kuhnian concept of a paradigm can be useful when addressing the subject of frontier research, since the gradual creation of new paradigms that open the way for unexpected dimensions of knowledge to emerge is a property that, while not exclusive to frontier research, is highly characteristic of it.

What is frontier research?
A simple definition of frontier research could be “research taking place at the frontiers of knowledge,” perhaps qualified with “in a particular area or field.” This definition is clearly somewhat imprecise, however. Indeed, arguably it dodges the question, as it merely pushes the definition back a step, to the question of what the frontiers of knowledge are, and fails to offer a distinguishing feature with which to identify frontier research: all research is about something unknown and can potentially contribute new knowledge. On this basis, all research would be frontier research, because it takes place at the frontier between what is known and what is not. This conclusion is obviously unsatisfactory, as we want to be able to distinguish between genuine frontier research and other research which we
might call “mainstream” (or normal science, in Kuhn’s terms).

It may be helpful to list a few of the aspects that characterise frontier research: (1) it usually addresses issues about which there is considerable controversy in the scientific community of the area in which they are being explored; (2) it deals with questions that are hard to answer, at least by applying the usual methodological approaches; (3) it employs methodologies and concepts that are atypical for the field concerned; (4) it takes unexpected findings that challenge the dominant paradigm as its starting point; (5) continuing from the previous point, it focuses on issues whose resolution is key to confirming (or rebutting) the prevailing paradigm; (6) it involves a very high degree of uncertainty as to its likelihood of success; and so on. Not all frontier research meets all these criteria. What most often seems to be its common feature is its potential to transform our understanding and put it on a new footing: the ability to yield results which represent a significant step forward in our knowledge, generating new paradigms that open the door to new approaches and ways of thinking, and to new questions and issues that are not possible within the so-called standard framework of science, i.e. mainstream rather than frontier science. Frontier research also tends to involve high costs and faces a high risk of failure.

Life at the frontier is tough
Some frontier research is very popular, such as that driven by the prevailing paradigm, i.e. research whose results can support or refute it. In other cases, frontier research takes place in highly innovative methodological and conceptual frameworks whose potential is recognised by mainstream research, but which are not yet sufficiently well established in the scientific community concerned. In such cases, pioneering research fields can turn into complete fields of knowledge in their own right.

However, there are other areas of frontier research in which the potential to generate knowledge is hard to estimate. This is precisely because such research is pursued in areas that are intrinsically difficult to investigate, with little background, atypical approaches and, in many cases, relatively few researchers and experts working on them. The results are often unpredictable. What is more, much of this research is not well received by the majority’s status quo, or may even be completely ignored. The researchers who venture to undertake it are explorers, blazing trails without knowing where they will end up.

Today’s R&D systems are not conducive to frontier research, precisely because of its intrinsic characteristics. Being high risk, it rarely receives a good score from the committees making research funding decisions, which expect to see preliminary results supporting a proposal. Moreover, there is often a shortage of experts sufficiently knowledgeable about the proposed research, such that peer review, the defining feature of most of today’s scientific systems, is unable to operate. For the same reasons that underlie its difficulties in obtaining funding, frontier research also finds it difficult to gain traction within the scientific community by disseminating its findings through publications or communications. This means that, no matter how stimulating or attractive it may be, frontier research tends not to be the path chosen by most scientists and, more worryingly still, it tends to be shunned by young scientists, who have the greatest potential for creativity and innovation, precisely because of its very high risk of failure and the consequent exclusion from the mainstream research community. Current systems of scientific promotion impose severe penalties for failure, thus discouraging risk-taking. This compounds the intrinsic difficulty that usually accompanies frontier research, and explains why the intensity of research at the frontier is generally lower than in what Kuhn calls normal science.

This approach, or perhaps attitude, however, potentially has very damaging effects on the scientific system in the long term, and even in the medium term. Frontier research is the seedbed of new ideas to draw on when the moment of crisis Kuhn talked about arrives. If we want a genuine transformation of knowledge, we need to encourage frontier research and to recognise and foster the spirit of critical enquiry among our young scientists. We need new ideas and methodological and conceptual approaches, as well as the excellent technical training that mainstream science can provide. Otherwise we will not be able to respond to the next question, the next unexpected finding, or the next innovative challenge we meet, as this requires the R&D system to have knowledge that at present we can neither suspect nor imagine. Frontier research has this ability to bring the future into the present, even though its practitioners themselves may be unable to anticipate it.

07 News



FGCSIC Workshops
The Fundación General CSIC has launched “Workshops FGCSIC” to create an opportunity for high-level meetings between professionals, researchers and other stakeholders in fields such as technology watch, competitive intelligence, strategic planning applied to R&D, scientific foresight, and science and technology output metrics.


The first workshop in the series was held on 25 May at Caixa Forum Madrid and dealt with the topic of “Key aspects of technology watch and competitive intelligence.” The events are designed as a series of workshops, each subdivided into three parts: starting with a short talk by a specialist to set the context and highlight the main lines of the topic; followed by hands-on work on a practical case by the participants; and rounded off with a discussion and sharing of conclusions. The invited speakers were Eliana Benjumeda, director of Infoline and expert in competitive intelligence at Acciona; Mario Esteban, manager of Competitive Intelligence at Acciona; Clara Parapar, FGCSIC project manager and the designer of the workshop’s content; and Ramón Maspons, director of the Projects Unit at BioCat. The next Workshop FGCSIC will be on science and technology output metrics and will be held in September.

Visit the FGCSIC workshops microsite at http://www.fgcsic.es/workshops

“Claves del Alzheimer” TV series
To accompany the International Year of Research into Alzheimer’s and other neurodegenerative diseases, the Fundación General CSIC is taking part in the science communication TV programme “Claves del Alzheimer”, first shown on the Internet television channel Indagando.tv.

“Claves del Alzheimer” is broadcast online weekly on Fridays, starting at 10.00 a.m. The programmes are 20 minutes long and are repeated every two hours. The series of thirteen programmes examines the main characteristics of the disease and reviews the latest lines of basic and clinical research. The programmes also look at the social and health-care aspects of Alzheimer’s disease.

Watch the programmes on the Indagando.tv Internet channel (http://www.indagando.tv/) or on the FGCSIC’s YouTube channel (http://www.youtube.com/fgcsic).


FGCSIC survey on Ageing
Carrying on from its report on R&D relating to ageing, the Foundation is running a survey aimed at researchers working in the Spanish R&D system, particularly those in fields relating to ageing. The aim of the survey is to gather information enabling a scientific foresight exercise to be run on ageing-related fields, in order to identify today’s main scientific challenges and the likely directions of future research. The survey will be conducted online via a web-based application.



The FGCSIC teams up with MIT for the Spanish tr35 awards
Technology Review, the Massachusetts Institute of Technology (MIT)’s flagship publication, in conjunction with a group of institutions including the Fundación General CSIC, is extending the tr35 awards to Spain for the first time, to identify the country’s best talent in the innovation field. The tr35 Spain initiative is sponsored by the BBVA’s Innovation Centre and coordinated by Opinno, Open Innovation.

The panel of judges will select the top ten innovators aged under 35 working in Spain whose technical work has been successfully applied in recent years or has great potential for development over the coming decades. Candidates are nominated by their colleagues in business or academia. The award ceremony will take place in Málaga on 26-27 October 2011.

For more, see the tr35 Spain awards website at: www.tr35spain.com

FGCSIC, networked

The Fundación General CSIC has registered on the Spanish Biotech Platform and is also present on the Automotive components industry technology platform (SERTEC-M2F). It has also joined the professional networking site LinkedIn.

José Luis de Miguel, president of LES España-Portugal
José Luis de Miguel Antón, deputy director of the Fundación General CSIC, has been appointed president of LES España-Portugal (Licensing Executives Society International Inc.), an association focusing on professional and business practice.

LES International is recognised by the World Intellectual Property Organisation (WIPO).

See the LES España-Portugal (Licensing Executives Society International Inc.) website at: http://www.lesi.org

Meeting of the trustees

On 18 May the trustees of the Fundación General CSIC met at the offices of the Spanish National Research Council (Consejo Superior de Investigaciones Científicas). The agenda included approving the Foundation’s Activity Report for 2010 and a presentation of the report on the current progress of the 2011 Action Plan.

Proyectos Cero on Ageing
The closing date for the submission of full proposals for projects shortlisted in the first phase of the Proyectos Cero call for proposals on ageing was 31 May.

For more, see the FGCSIC microsite on Ageing at: http://www.fgcsic.es/envejecimiento/en_EN

Science and the Humanities
The central goal of the BBVA Foundation’s activity is to support world-class scientific research, music, artistic and literary creation, and the humanities. Science, technology, music and art, and their academic study in the framework of the humanities, form a continuum that acts to shape the culture and sensibility of our time.

The BBVA Foundation promotes knowledge through managed programs that take in research projects, advanced training, and the relaying to society of the products of these research and creative endeavors. Its focus areas are the environment (biodiversity, climate change), biomedicine, basic sciences and technology, economy and society, classical and contemporary music, literature, plastic arts and the humanities.

The BBVA Foundation also recognizes the achievements of researchers and artists through a series of award schemes. The BBVA Foundation Frontiers of Knowledge Awards, run in collaboration with the CSIC and currently in their third year, honor outstanding contributions at the international level that have significantly enlarged the sphere of knowledge in the following eight fields: Basic Sciences (Physics, Chemistry, Mathematics); Biomedicine; Ecology and Conservation Biology; Information and Communication Technologies; Economics, Finance and Management; Contemporary Music; Climate Change; and Development Cooperation.

Through these varied activities, the BBVA Foundation puts into practice one of the BBVA Group’s core principles: to work for a better future for people through the ongoing promotion of knowledge and innovation.

www.fbbva.es

