of the
Intelligence Explosion
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org
"Intelligence explosion:"
• Concept invented by I. J. Good (a famous name in Bayesian statistics) in 1965.
• Hypothesis: The smarter you are, the
more creativity you can apply to the task
of making yourself even smarter.
• Prediction: Positive feedback cycle rapidly
leading to superintelligence.
(Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine.
Pp. 31-88 in Advances in Computers, 6, F. L. Alt and M. Rubinoff, eds. New
York: Academic Press.)
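The positive-feedback claim above can be illustrated with a toy model (not from the talk, and all parameter names and values are illustrative assumptions): if the rate of self-improvement grows faster than linearly with current capability, the trajectory is super-exponential, a cartoon version of an "explosion."

```python
# Toy positive-feedback model: smarter systems improve themselves faster.
# With growth rate proportional to intelligence**p, p = 1 gives ordinary
# exponential growth; p > 1 gives super-exponential (finite-time blowup
# in the continuous limit). Purely illustrative; not a real forecast.

def simulate(p=1.5, k=0.1, steps=100, dt=0.1, cap=1e9):
    intelligence = 1.0
    trajectory = [intelligence]
    for _ in range(steps):
        # The feedback loop: improvement rate depends on current level.
        intelligence += k * intelligence**p * dt
        trajectory.append(intelligence)
        if intelligence > cap:  # stop once growth has clearly diverged
            break
    return trajectory

linear = simulate(p=1.0)     # plain exponential growth
explosive = simulate(p=1.5)  # feedback-amplified growth, pulls away over time
```

Under the same starting point and rate constant, the `p > 1` run overtakes the `p = 1` run and accelerates away, which is the qualitative shape of Good's prediction.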
The intelligence explosion hypothesis
does not imply or require:
• More change occurred from 1970 to 2000
than from 1940 to 1970.
• Technological progress follows a
predictable curve.
• Does not even require that "Real AI" is
possible! (An intelligence explosion could
happen with augmented humans.)
[Image: chimp vs. Einstein]
The power of intelligence:
• Fire
• Language
• Nuclear weapons
• Skyscrapers
• Spaceships
• Money
• Science
• Go to http://singinst.org/
(Or google "Singularity Institute")
• Click on "Summit Notes"
Chimp vs. Einstein
Bees vs. Hunter-gatherers
Chimps vs. Internet
• Therefore, we should not build AI.
• Therefore, we should build AI.
Freepy AIs
Gloopy AIs
"I have often used this edict with groups I have led -
particularly when they face a very tough problem,
which is when group members are most apt to
propose solutions immediately."
-- Robyn Dawes