X-risks prevention plan

A Roadmap: Plan of action to prevent human extinction risks
Alexey Turchin
www.existrisks.org

Uploaded by avturchin, posted 11-Apr-2017

TRANSCRIPT

Page 1: X-risks prevention plan

A Roadmap: Plan of action to prevent human extinction risks
Alexey Turchin

www.existrisks.org

Page 2: X-risks prevention plan

X-Risks

Page 3: X-risks prevention plan

Exponential growth of technologies

Unlimited possibilities

Including destruction of humanity

X-Risks

Page 4: X-risks prevention plan

Main X-Risks

Large scale nuclear war

Runaway global warming

Nanotech - grey goo

AI

Synthetic biology

Nuclear doomsday weapons

Page 5: X-risks prevention plan

Crowdsourcing

Contest: more than 50 ideas were added

David Pearce and Satoshi Nakamoto contributed

Page 6: X-risks prevention plan

Plan A is complex:

International Control System (Plan A1.1)
Decentralized monitoring of risks (Plan A1.2)
Friendly AI (Plan A2)
Rising Resilience (Plan A3)
Space Colonization (Plan A4)

We should do all of it simultaneously:
international control, AI, robustness, and space

Page 7: X-risks prevention plan

Plans B, C and D have smaller chances of success:

Survive the catastrophe (Plan B)
Leave backups (Plan C)
Improbable ideas (Plan D)
Bad plans

Page 8: X-risks prevention plan

Plan A1.1 International Control System

Research
Social Support
International Cooperation
Risk Control
Worldwide risk prevention authority
Active Shields

Page 9: X-risks prevention plan

Plan A1.1 International Control System
Step 1 (Planning): Research

• Long-term future model
• Comprehensive list of risks, with probability assessment and prevention roadmap
• Wiki and internet forum
• Integration of approaches, funding, education, translation
• Additional study areas: biases, law
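A "comprehensive list of risks with probability assessment" implies combining per-risk estimates into an overall figure. A minimal sketch of one common aggregation, assuming (unrealistically) that the risks are independent; the function name and all numbers are illustrative, not from the roadmap:

```python
def total_extinction_risk(risk_probs):
    """Probability that at least one catastrophe occurs,
    assuming the individual risks are independent."""
    p_none = 1.0
    for p in risk_probs:
        p_none *= 1.0 - p  # probability this particular risk does not occur
    return 1.0 - p_none

# Purely illustrative per-risk estimates (e.g. AI, synthetic biology, nuclear war):
print(total_extinction_risk([0.10, 0.05, 0.03]))
```

In reality the risks interact (a large nuclear war changes the odds of all the others), so an independence table like this is a starting point for the modelling work, not a final answer.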

Page 10: X-risks prevention plan

Plan A1.1 International Control System
Step 2 (Preparation): Social support

• Cooperative scientific community: peer-reviewed journal, conferences, intergovernmental panel, international institute
• Popularization (articles, books, forums, media)
• Public support, street action (anti-nuclear protests in the 1980s)
• Political support: lobbyism, parties

Page 11: X-risks prevention plan

Plan A1.1 International Control System
Step 3: International Cooperation
First level of defence on low-tech level

• All states contribute to the UN to fight some global risks
• Group of supernations takes responsibility for x-risks prevention
• International law about x-risks
• International agencies dedicated to certain risks
• A smaller catastrophe could help unite humanity

Page 12: X-risks prevention plan

Plan A1.1 International Control System
Step 3: Risk control
First level of defence on low-tech level

• International ban on dangerous technologies, or voluntary relinquishment
• Freezing potentially dangerous projects for 30 years
• Concentrate all bio, nano and AI research in several controlled centers
• Differential technological development: develop safety and control technologies first (Bostrom)

Page 13: X-risks prevention plan

Plan A1.1 International Control System
Step 4: Worldwide risk prevention authority
Second level of defence on high-tech level

• Center for quick response to any emerging risk; x-risks police
• Worldwide video-surveillance and control
• Peaceful unification of the planet based on a system of international treaties
• Narrow-AI-based expert system on x-risks, Oracle AI
• Control over dissemination of knowledge of mass destruction (create white noise in the field)

Page 14: X-risks prevention plan

Plan A1.1 International Control System
Step 4: Active shields
Second level of defence on high-tech level

• Geoengineering, anti-asteroid shield
• Nano-shield: distributed system of control of hazardous replicators
• Bio-shield: worldwide immune system
• Mind-shield: control of dangerous ideation by means of brain implants
• Worldwide monitoring, security and control systems

Page 15: X-risks prevention plan

Risks of plan A1.1

The worldwide risk prevention authority may lead to a planet unification war, ending in global catastrophe.
Active Shields may fail through a fatal mistake in the world control system, also ending in global catastrophe.

Page 16: X-risks prevention plan

Result: Singleton

«A world order in which there is a single decision-making agency at the highest level» (Bostrom)

Worldwide government system based on AI
Super AI which prevents all possible risks and provides immortality and happiness to humanity
Colonization of the solar system, interstellar travel and Dyson spheres
Colonization of the Galaxy
Exploring the Universe

Page 17: X-risks prevention plan

Plan A1.2 Decentralised risk monitoring

Values transformation
Improving human intelligence and morality
Cold war and WW3 prevention
Decentralized risks monitoring

Page 18: X-risks prevention plan

Plan A1.2 Decentralised risk monitoring
Step 1: Values Transformation

• The value of the indestructibility of civilization should become the first priority on all levels
• Reduction of radical religious (ISIS) or nationalistic values
• Popularity of transhumanism
• Movies, novels and other works of art that honestly depict x-risks and motivate their prevention
• Memorial and awareness days: Earth Day, Petrov Day, Asteroid Day

Page 19: X-risks prevention plan

Plan A1.2 Decentralised risk monitoring
Step 2: Improving human intelligence and morality

• Higher IQ, new rationality, fighting cognitive biases
• High empathy for new geniuses, lower proportion of destructive beliefs
• Engineered enlightenment: use brain science
• Prevent worst forms of capitalism
• Promote best moral qualities

Page 20: X-risks prevention plan

Plan A1.2 Decentralised risk monitoring
Step 3: Cold war and WW3 prevention
Dramatic social changes

• International conflict management authority, like an international court
• Large project which could unite humanity
• Antiwar and antinuclear movement
• Cooperative decision theory in international politics
• Prevent brinkmanship
• Prevent nuclear proliferation

Page 21: X-risks prevention plan

Plan A1.2 Decentralised risk monitoring
Step 4: Decentralized risks monitoring

• Transparent society: groups of vigilantes, “Anonymous”-style hacker groups
• Decentralized control: local police, mutual control, whistle-blowers
• Net-based safety: ring of x-risks prevention organizations
• Economic stimulus: prizes for any risk found and prevented
• Monitoring of smoke, not fire

Page 22: X-risks prevention plan

Plan A2 Friendly AI

Study and Promotion
Solid Friendly AI theory
AI practical studies
Seed AI
Superintelligent AI

Page 23: X-risks prevention plan

Plan A2 Friendly AI
Step 1: Study and Promotion

• Study of Friendly AI theory
• Promotion of Friendly AI (Bostrom and Yudkowsky)
• Fundraising (MIRI)
• Slowing other AI projects (recruiting scientists)
• FAI free education, starter packages in programming

Page 24: X-risks prevention plan

Plan A2 Friendly AI

Page 25: X-risks prevention plan

Plan A2 Friendly AI
Step 2: Solid Friendly AI theory

• Human values theory and decision theory
• Full list of possible ways to create FAI, and sublist of best ideas
• Proven safe, fail-safe, intrinsically safe AI
• Preservation of the value system during AI self-improvement
• A clear theory that is practical to implement

Page 26: X-risks prevention plan

Plan A2 Friendly AI
Step 3: AI practical studies

• Narrow AI
• Human emulations
• Value loading
• FAI theory promotion to most AI teams; they agree to implement it and adapt it to their systems
• Tests of FAI theory on non-self-improving models

Page 27: X-risks prevention plan

Plan A2 Friendly AI
Step 4: Seed AI

Creation of a small AI capable of recursive self-improvement and based on Friendly AI theory

Page 28: X-risks prevention plan

Plan A2 Friendly AI

Page 29: X-risks prevention plan

Plan A2 Friendly AI
Step 5: Superintelligent AI

• Seed AI quickly improves itself and undergoes “hard takeoff”
• It becomes the dominant force on Earth
• AI eliminates suffering, involuntary death, and existential risks
• AI Nanny: one hypothetical variant of super AI that only acts to prevent existential risks (Ben Goertzel)

Singleton
Unfriendly AI

Page 30: X-risks prevention plan

Plan A2 Friendly AI

Page 31: X-risks prevention plan

Plan A3 Rising Resilience

Improving sustainability of civilization
Improving human intelligence and morality
High-speed tech development
Timely achievement of immortality
AI based on uploading of its creator

Page 32: X-risks prevention plan

Plan A3 Rising Resilience
Step 1: Improving sustainability of civilization

• Intrinsically safe critical systems
• Growing diversity
• Universal methods of catastrophe prevention
• Building reserves (food stocks)
• Widely distributed civil defence

Page 33: X-risks prevention plan

Plan A3 Rising Resilience
Step 2: Useful ideas to limit catastrophe scale

• Limit the impact of catastrophe: quarantine, rapid production of vaccines, growing stockpiles
• Increase time available for preparation: support general research on risks, connect disease surveillance systems
• Worldwide x-risk prevention exercises
• The ability to quickly adapt to new risks

Page 34: X-risks prevention plan

Plan A3 Rising Resilience
Step 3: High-speed tech development, needed to quickly pass the risk window

• Investment in super-technologies (nanotech, biotech)
• High-speed technical progress helps to overcome the slow process of resource depletion
• Invest more in defensive technologies than in offensive

Page 35: X-risks prevention plan

Plan A3 Rising Resilience

Page 36: X-risks prevention plan

Plan A3 Rising Resilience
Step 4: Timely achievement of immortality
Miniaturization for survival and invincibility

• Nanotech-based immortal body
• Diversification of humanity into several successor species capable of living in space
• Mind uploading
• Integration with AI
• Earth crust colonization by miniaturized nanotech bodies
• Moving into a simulated world inside a small self-sustained computer

Page 37: X-risks prevention plan

Plan A3 Rising Resilience

Page 38: X-risks prevention plan

Plan A4 Space colonisation

Temporary asylums in space
Space colonies on large planets
Colonisation of the Solar system
Interstellar travel

Page 39: X-risks prevention plan

Plan A4 Space colonisation

Page 40: X-risks prevention plan

Plan A4 Space colonisation
Step 1: Temporary asylums in space

• Space stations as temporary asylums (ISS)
• Cheap and safe launch systems

Page 41: X-risks prevention plan

Plan A4 Space colonisation
Step 2: Space colonies on large planets

Creation of space colonies on the Moon and Mars (Elon Musk) with 100–1000 people

Page 42: X-risks prevention plan

Plan A4 Space colonisation
Step 3: Colonization of the Solar system

• Self-sustaining colonies on Mars and large asteroids
• Terraforming of planets and asteroids using self-replicating robots, and building space colonies there
• Millions of independent colonies inside asteroids and comet bodies in the Oort cloud

Page 43: X-risks prevention plan

Plan A4 Space colonisation
Step 4: Interstellar travel

• “Orion”-style, nuclear-powered “generation ships” with colonists
• Starships which operate on new physical principles, with immortal people on board
• Von Neumann self-replicating probes with human embryos

Page 44: X-risks prevention plan

Result

Interstellar distributed humanity
Many unconnected human civilizations
New types of space risks (space wars, planetary and stellar explosions, AI and nanoreplicators, ET civilizations)

Page 45: X-risks prevention plan

Plan A4 Space colonisation

Page 46: X-risks prevention plan

Plan B Survive the catastrophe

Preparation
Building
Readiness
High-tech bunkers
Rebuilding civilisation after catastrophe

Page 47: X-risks prevention plan

Plan B Survive the catastrophe

Page 48: X-risks prevention plan

Plan B Survive the catastrophe
Step 1: Preparation

• Fundraising and promotion
• Textbook to rebuild civilization (Dartnell’s book «The Knowledge»)
• Hoards with knowledge, seeds and raw materials (Doomsday vault in Norway)
• Survivalist communities

Page 49: X-risks prevention plan

Plan B Survive the catastrophe
Step 2: Building

• Underground bunkers, space colonies
• Nuclear submarines
• Seasteading

Natural refuges:
• Uncontacted tribes
• Remote villages
• Remote islands
• Oceanic ships
• Research stations in Antarctica

Page 50: X-risks prevention plan

Plan B Survive the catastrophe
Step 3: Readiness

• Crew training
• Crews in bunkers
• Crew rotation
• Different types of asylums
• Frozen embryos

Page 51: X-risks prevention plan

Plan B Survive the catastrophe

Page 52: X-risks prevention plan

Plan B Survive the catastrophe
Step 4: Miniaturization for survival and invincibility

• Earth crust colonization by miniaturized nanotech bodies
• Moving into a simulated world inside a small self-sustained computer
• Adaptive bunkers based on nanotech

Page 53: X-risks prevention plan

Plan B Survive the catastrophe
Step 5: Rebuilding civilisation after catastrophe

• Rebuilding population
• Rebuilding science and technology
• Prevention of future catastrophes

Page 54: X-risks prevention plan

Result

Reboot of civilization
Several reboots may happen
Finally there will be either total collapse or a new supercivilization level

Page 55: X-risks prevention plan

Plan C Leave backups

Time capsules with information
Messages to ET civilizations
Preservation of earthly life
Robot-replicators in space

Page 56: X-risks prevention plan

Plan C Leave backups
Step 1: Time capsules with information

• Underground storage with information and DNA for future non-human civilizations
• Eternal disks from the Long Now Foundation (or M-disks)

Page 57: X-risks prevention plan

Plan C Leave backups

Page 58: X-risks prevention plan

Plan C Leave backups
Step 2: Messages to ET civilizations

• Interstellar radio messages with encoded human DNA
• Hoards on the Moon, frozen brains
• Voyager-style spacecraft with information about humanity

Page 59: X-risks prevention plan

Plan C Leave backups

Page 60: X-risks prevention plan

Plan C Leave backups
Step 3: Preservation of earthly life

• Create conditions for the re-emergence of new intelligent life on Earth
• Directed panspermia (Mars, Europa, space dust)
• Preservation of biodiversity and highly developed animals (apes, habitats)

Page 61: X-risks prevention plan

Plan C Leave backups

Page 62: X-risks prevention plan

Plan C Leave backups
Step 4: Robot-replicators in space

• Mechanical life
• Preservation of information about humanity for billions of years
• Safe narrow AI

Page 63: X-risks prevention plan

Result

Resurrection by another civilization
Resurrection of specific people
Creation of a civilization which has many values and traits in common with humans

Page 64: X-risks prevention plan

Plan D Improbable ideas

Saved by non-human intelligence
Quantum immortality
Strange strategy to escape Fermi paradox
Technological precognition
Manipulation of the extinction probability using the Doomsday argument
Control of the simulation (if we are in it)

Page 65: X-risks prevention plan

Plan D Improbable ideas
Idea 1: Saved by non-human intelligence

• Maybe extraterrestrials are looking out for us and will save us
• Send radio messages into space asking for help if a catastrophe is inevitable
• Maybe we live in a simulation and the simulators will save us
• The Second Coming, a miracle, or life after death

Page 66: X-risks prevention plan

Plan D Improbable ideas
Idea 2: Quantum immortality

• If the many-worlds interpretation of QM is true, an observer will survive any death, including any global catastrophe (Moravec, Tegmark)
• It may be possible to create a near one-to-one correspondence between observer survival and survival of a group of people (e.g. if all are in a submarine)
• Other human civilizations must exist in the infinite Universe

Page 67: X-risks prevention plan

Plan D Improbable ideas

Page 68: X-risks prevention plan

Plan D Improbable ideas
Idea 3: Strange strategy to escape the Fermi paradox

A random strategy may help us to escape some dangers that killed all previous civilizations in space

Page 69: X-risks prevention plan

Plan D Improbable ideas

Page 70: X-risks prevention plan

Plan D Improbable ideas
Idea 4: Technological precognition

• Prediction of the future based on advanced quantum technology, and avoiding dangerous world-lines
• Search for potential terrorists using new scanning technologies
• Special AI to predict and prevent new x-risks

Page 71: X-risks prevention plan

Plan D Improbable ideas
Idea 5: Manipulation of the extinction probability using the Doomsday argument

• A decision to create more observers in case an unfavourable event X starts to happen, thus lowering its probability (the UN++ method by Bostrom)
• Lowering the birth density to get more time for the civilization
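The Doomsday argument referenced above has a simple quantitative version, Gott's delta-t argument: if we observe a phenomenon at a random moment of its total lifetime, its past duration bounds its likely remaining duration. A minimal sketch (the formula is standard; the slides do not spell it out, and the example number is illustrative):

```python
def gott_interval(t_past, confidence=0.95):
    """Gott's delta-t argument: given a phenomenon that has lasted t_past,
    observed at a uniformly random moment of its total lifetime, return
    (low, high) bounds on its remaining duration at the given confidence."""
    low = t_past * (1 - confidence) / (1 + confidence)
    high = t_past * (1 + confidence) / (1 - confidence)
    return low, high

# At 95% confidence the remaining duration lies between t_past/39 and 39*t_past:
low, high = gott_interval(200_000)  # e.g. rough age of Homo sapiens in years
```

The UN++ idea above acts on the observer-counting form of the same argument: deliberately adding future observers shifts where a randomly sampled observer expects to sit in the total population, and hence the inferred odds of imminent doom.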

Page 72: X-risks prevention plan
Page 73: X-risks prevention plan

Plan D Improbable ideas
Idea 6: Control of the simulation (if we are in it)

• Live an interesting life so our simulation isn’t switched off
• Don’t let the simulators know that we know we live in a simulation
• Hack the simulation and control it
• Negotiate with the simulators, or pray for help

Page 74: X-risks prevention plan

Bad plans

Prevent x-risk research because it only increases risk
Controlled regression
Depopulation
Unfriendly AI may be better than nothing
Attracting good outcome by positive thinking

Page 75: X-risks prevention plan

Bad plans
Idea 1: Prevent x-risk research because it only increases risk

• Do not advertise the idea of man-made global catastrophe
• Don’t try to control risks, as that would only give rise to them
• As we can’t measure the probability of global catastrophe, it may be unreasonable to try to change it
• Do nothing

Page 76: X-risks prevention plan

Bad plans
Idea 2: Controlled regression

• Use a small catastrophe to prevent a large one (Willard Wells)
• Luddism (Kaczynski): relinquishment of dangerous science
• Creation of an ecological civilization without technology (“World Made by Hand”, anarcho-primitivism)
• Limitation of personal and collective intelligence to prevent dangerous science
• Antiglobalism and diversification into a multipolar world

Page 77: X-risks prevention plan

Bad plans

Page 78: X-risks prevention plan

Bad plans
Idea 3: Depopulation

• Could provide resource preservation and make control simpler
• Natural causes: pandemics, war, hunger (Malthus)
• Extreme birth control
• Deliberate small catastrophe (bio-weapons)

Page 79: X-risks prevention plan

Bad plans

Page 80: X-risks prevention plan

Bad plans
Idea 4: Unfriendly AI may be better than nothing

• Any super AI will have some memory about humanity
• It will use simulations of human civilization to study the probability of its own existence
• It may share some human values and spread them through the Universe

Page 81: X-risks prevention plan

Bad plans

Page 82: X-risks prevention plan

Bad plans
Idea 5: Attracting good outcome by positive thinking

• Preventing negative thoughts about the end of the world and about violence
• Maximum positive attitude «to attract» a positive outcome
• Secret police which use mind control to find potential terrorists, and superpowers to stop them
• Start partying now

Page 83: X-risks prevention plan

Dynamic roadmaps

The next stage of the research will be the creation of collectively editable wiki-style roadmaps.
They will cover all existing topics of transhumanism and future studies.
Create an AI system based on the roadmaps, or one working on their improvement.

Page 84: X-risks prevention plan

You can read all the roadmaps at:
www.immortality-roadmap.com
www.existrisks.org