
Organizations, sites, and people involved in x-risk prevention

Known very well, and a large amount of work has been done

MIRI
(formerly the Singularity Institute)
E. Yudkowsky
link

Still exists! Was famous in the 1970s for producing The Limits to Growth. Link

OpenAI

Oxford Martin
Programme

CSER

has been created, but not many people know about them

Foundational Research Institute

Global Catastrophic Risk Institute,
Seth Baum

Phil Torres

Interesting articles by its main author; focus on existential terrorism and religion
link

Interesting newsletter; many articles in scientific journals
link

Skoll Global
Threats Fund

Global
challenges

X-risks
net

Leverage Research

Alexei Turchin
Creating a full database on x-risks and a prevention plan

Elon Musk
Wants AI safety through OpenAI, and humans on Mars as a backup plan

The site is almost empty now


link

link

The Center for International Security and Cooperation is Stanford University's hub for researchers tackling some of the world's most pressing security and international cooperation problems.
Nuclear, cybersecurity, bio, antiterrorism, link

To safeguard humanity from global threats.
Climate, water security, pandemics, nuclear proliferation, link

Currently, our research focuses on reducing risks of dystopian futures in the context of
emerging technologies.
Interesting work on AI safety

Convergence

Lifeboat Foundation

Justin Shovelain
A collective think tank focused on mathematical modeling of x-risks
link

Stimson
Center

The Stimson Center is a nonpartisan policy research center working to solve the world's greatest threats to security and prosperity.
Non-proliferation, link

Saving
Humanity

A very large scientific board which doesn't actually do anything, but some useful discussion goes on in its mailing list
site

Public
figures

Arctic News
Sam Carana
Irreversible global warming because of methane hydrate eruptions
link

The Lawrence
Livermore National
Laboratory

has a division called the Global Security Principal Directorate, which researches issues such as biosecurity and counter-terrorism on behalf of the government. Link

Forum about risks of a flu pandemic

Supports the smartest minds and most effective organizations to reduce nuclear stockpiles, prevent new nuclear states, and increase global security.
Link

Impact
risks

Nano
risks

Defusing the nuclear threat

NASA

Foresight Institute

link

Has its own foundation and vision of global risks

link

link

Zoltan Istvan

Stephen Hawking
Warned about the risks of aliens and AI

Writers

has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source.

Ploughshares
Fund

Bill Gates
Investor in x-risk-related projects, wiki

Invested
in MIRI

includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises. GAR helps member states with training and coordination of response to epidemics. link

flutrackers.com

A small one-person organisation without any actual work
link

Jaan Tallinn

Peter Thiel

Bill Joy
Wrote a famous article but now seems to have lost interest

World Health
Organization
(WHO)

The United States


Agency for International Development
(USAID)

from Homo Sapiens

Mailing list, a good one

Sam Altman
Y Combinator,
co-founded OpenAI

Intergovernmental Panel on Climate Change

Nuclear Threat Initiative, link

CISAC

link

The Global Challenges Foundation works to raise awareness of global catastrophic risks. Primarily focused on climate change, other environmental degradation, and politically motivated violence, as well as how these threats are linked to poverty and rapid population growth, link

and important
figures

GCRI

Famous Doomsday Clock
link

Bio-risks

IPCC

Cambridge Centre for the Study of Existential Risk
Martin Rees, link

on the Impacts of Future


Technology, link

link

Laszlo Szombatfalvy

Investors

X-Risks Institute

Bulletin of the Atomic Scientists

Club of Rome

Future of Humanity Institute
Oxford, link
Nick Bostrom

Elon Musk
wiki

Global Priorities Project

Nuclear

Future of Life Institute
Elon Musk
link

Effective altruism
EA forum

Created the Global Catastrophic Risks report 2016
Collaborates with the UK government
Dr. Toby Ord is a member
Connected with the EA movement

FHI

FLI

EA

Large and
interesting
research

General x-risks

AI risks

Size and level of influence

Global
warming

Presidential candidate from the Transhumanist Party
Wrote about x-risks

Vernor Vinge

Greg Egan
Writer, Permutation City

Writer, created the Singularity idea

David Brin

John Barnes

writer,
Existence

Mother of Storms
Holocene Impact Working Group
Estimates risks of recent impacts
link

Scientists
and researchers

Open
places for
discussion

A. Sandberg

Adrian
Kent

Participated in
FHI and co-authored papers

LHC risks

Toby Ord
site
existential hope

Milan Ćirković
Stevenson probe, anthropic shadow, Fermi paradox
Site

Bruce Tonn
Editor and writer
link

Max Tegmark
Wrote articles together
with Bostrom

Norwegian transhumanists

LessWrong

Facebook

Existential risks (Adam Ford)
Global Catastrophic Risks Research and Discussion (Evan Gaensbauer)
Global catastrophic risks
Stop existential risks

Robin
Hanson
Blog
Societal collapse
risks

Katja Grace

Willard Wells

R. Freitas

Fermi paradox
and DA
blog
AI impacts

Author of Apocalypse When? and a prevention plan

Nanotech risks

X-risks on Reddit:
ExistentialRisk
ControlProblem

Resource depletion risks

David
Denkenberger
agricultural risks

A large group of people working on AI safety, including, but not limited to:
Steve Omohundro
Luke Muehlhauser
Stuart Armstrong
Roman Yampolskiy
Nate Soares
Vladimir Nesov
Kaj Sotala
Benja Fallenstein
Riva Melissa Tez
Jason Gaverick Matheny, wiki
Andrew Critch, blog
Paul Christiano
Carl Shulman
Anna Salamon

Dennis Meadows

Alexander Kononov
Coined the term "indestructibility of civilization"

R.Carrigan

Arnon Dar

Bill Napier

Risks of SETI

Risks of supernovae

Risks of dark
comets

Longecity
subforum

Wikiresources

link

LW-wiki

EA forum
link

Intelligent
agents forum

Technical discussion
on AI safety
link

Discussion in
comments
IEET
Futureoflife