By Kevin Mordovanec
Existential Risk
● Anything that threatens humanity’s survival
● Climate change, overpopulation, nuclear war
● Most important problem ever faced
● The risks above are widely discussed
● One is not: AI
Introduction to the Singularity
● What is AI?
● Moore’s Law
● AI will get smarter than us
● It’ll hold godlike power
● 42% of surveyed experts expect human-level AI by 2030
● 90% probability by 2075
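The Moore's Law point above can be made concrete with a toy calculation. This is a minimal sketch, not from the slides: the doubling period, starting value, and horizon are assumptions chosen only to illustrate how fast repeated doubling compounds.

```python
# Toy illustration of Moore's-Law-style exponential growth.
# Assumption (not from the slides): capability doubles every 2 years;
# the baseline of 1 and the 30-year horizon are made-up examples.

def after_doublings(start, years, period=2):
    """Value after `years`, doubling once every `period` years."""
    return start * 2 ** (years // period)

# Starting from a baseline of 1, thirty years of 2-year doublings
# multiply capability by 2**15 = 32768.
print(after_doublings(1, 30))
```

The takeaway is the shape of the curve, not the specific numbers: under any fixed doubling period, growth that looks modest for a decade becomes enormous a decade later.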
Introduction to Terms
● Artificial Intelligence (AI)
○ A human-made program designed to complete a task
● Artificial General Intelligence (AGI)
○ AI with problem-solving capabilities equal to or superior to a single human
● Artificial Superintelligence (ASI)
○ AI with problem-solving capabilities several orders of magnitude beyond a single human
● Singularity
○ The set of societal changes that will happen due to the emergence of ASI
● Friendly Artificial Intelligence (FAI)
○ AI that has goals which align with those of humans
● Unfriendly Artificial Intelligence (UFAI)
○ AI whose goals conflict with those of humans and are most likely actively harmful
Paperclip Maximizers
● AI acts only toward preprogrammed goals
● Imagine AI that makes paperclips
● Kills all life, converts everything into paperclips
● 31% of surveyed experts expect a bad outcome
● Would you enter a building with a 31% chance of fire?
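The slide's core claim is that an AI acts only toward its preprogrammed goal. A minimal sketch of that idea, with invented names and numbers: an agent that greedily maximizes a single objective never weighs harms, because harm simply does not appear in the objective it scores actions by.

```python
# Hedged toy sketch of the "paperclip maximizer" thought experiment.
# All action names and yields below are invented for illustration.

def choose_action(actions, paperclip_yield):
    # The agent ranks actions ONLY by paperclip output; side effects are
    # invisible to it because they never appear in the objective.
    return max(actions, key=paperclip_yield)

actions = [
    "make one paperclip",
    "melt factory tools into paperclips",
    "convert all matter into paperclips",
]
yields = {
    "make one paperclip": 1,
    "melt factory tools into paperclips": 1_000,
    "convert all matter into paperclips": 10**12,
}

# The most destructive option wins, because it maximizes the objective.
print(choose_action(actions, yields.get))
```

The point of the sketch is that nothing in the loop is malicious; the catastrophe falls out of an objective that omits everything humans actually value.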
Other Possible Disasters
● Told to maximize happiness: converts everything to hedonium
● Told to solve problems, but not to implement solutions: converts everything to computronium
Fictional Scenarios
● Avengers: Age of Ultron
● I, Robot
● Failed love utopia
● Friendship is Optimal