Very well known, and a large amount of work has been done
MIRI
Still exists!
Became famous in the 1970s when they produced The Limits to Growth. Link
OpenAI
Oxford Martin Programme
CSER
Foundational Research Institute
Phil Torres
Skoll Global Threats Fund
Global Challenges Foundation
X-risks net
Leverage Research
Alexei Turchin
Creating a full database of x-risks and a prevention plan
Elon Musk
Wants AI safety through OpenAI, and humans on Mars as a backup plan
link
Currently, our research focuses on reducing risks of dystopian futures in the context of emerging technologies.
Interesting work on AI safety
Convergence
Lifeboat Foundation
Justin Shovelain
Collective think tank focused on mathematical modeling of x-risks
link
Stimson Center
Saving Humanity
Public figures
Arctic news
Sam Carana
The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate, which researches issues such as bio-security and counter-terrorism on behalf of the government. Link
Impact risks
Nano-risks
Defusing the nuclear threat
NASA
Foresight Institute
link
link
link
Zoltan Istvan
Stephen Hawking
Warned about the risks of aliens and AI
Writers
Ploughshares Fund
Bill Gates
Investor in x-risk-related projects, wiki
Invested in MIRI
flutrackers.com
Jaan Tallinn
Peter Thiel
Bill Joy
Wrote a famous article, but now seems to have lost interest
World Health Organization (WHO)
Sam Altman
Y Combinator; co-founded OpenAI
Intergovernmental Panel on Climate Change
Nuclear Threat Initiative link
CISAC link
and important figures
GCRI
Famous Doomsday Clock
link
Bio-risks
IPCC link
Laszlo Szombatfalvy
Investors
X-Risks Institute
Bulletin of the Atomic Scientists
Club of Rome
Elon Musk
wiki
Nuclear
Effective altruism
EA forum
FHI
FLI
EA
Large and interesting research
General x-risks
AI risks
Global warming
Presidential candidate from the Transhumanist Party; wrote about x-risks
Vernor Vinge
Writer, created the Singularity idea
Greg Egan
Writer, Permutation City
David Brin
Writer, Existence
John Barnes
Writer, Mother of Storms
Holocene Impact Working Group
Estimates risks of recent impacts
link
Scientists and researchers
Open places for discussion
A. Sandberg
Adrian Kent
Participated in FHI and co-authored papers
LHC risks
Toby Ord
site
Existential Hope
Milan Ćirković
Stevenson probe, anthropic shadow, Fermi paradox
Site
Bruce Tonn
Editor and writer
link
Max Tegmark
Wrote articles together with Bostrom
Norwegian transhumanists
LessWrong
Robin Hanson
Blog
Societal collapse risks
Katja Grace
Willard Wells
R. Freitas
Fermi paradox and DA (Doomsday Argument)
blog
AI Impacts
Nanotech risks
X-risks on Reddit
r/ExistentialRisk
r/ControlProblem
David Denkenberger
Agricultural risks
Dennis Meadows
Alexander Kononov
Coined the term "indestructibility of civilization"
R. Carrigan
Arnon Dar
Bill Napier
Risks of SETI
Risks of supernovas
Risks of dark comets
LongeCity subforum
Wiki resources
link
LW-wiki
EA forum
link
Intelligent Agents Forum
Technical discussion on AI safety
link
Discussion in comments
IEET
Futureoflife