
The Human Importance of the Intelligence Explosion

Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org
"Intelligence explosion:"
• Concept invented by I. J. Good (famous
name in Bayesian statistics) in 1965.
• Hypothesis: The smarter you are, the
more creativity you can apply to the task
of making yourself even smarter.
• Prediction: Positive feedback cycle rapidly
leading to superintelligence.
(Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine.
Pp. 31-88 in Advances in Computers, 6, F. L. Alt and M. Rubinoff, eds. New
York: Academic Press.)
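Good's hypothesis can be caricatured as a one-line feedback model: capability grows at a rate that itself increases with capability. The sketch below is only an illustration of that qualitative claim, not anything from Good's paper; the growth law dI/dt = c·I^k, the constants, and the "superintelligence" cutoff are all made up for illustration.

```python
# Toy model of Good's feedback loop: dI/dt = c * I**k.
# For k > 1 the solution blows up in finite time ("explosion");
# for k <= 1 growth is exponential or slower. All constants are
# illustrative, not estimates of anything real.

def simulate(intelligence=1.0, c=0.1, k=1.5, dt=0.01, steps=10_000):
    t = 0.0
    for _ in range(steps):
        intelligence += c * intelligence**k * dt  # forward Euler step
        t += dt
        if intelligence > 1e6:  # arbitrary "superintelligence" cutoff
            return t
    return None  # cutoff never crossed in the simulated window

for k in (0.5, 1.0, 1.5):
    t = simulate(k=k)
    result = f"crosses cutoff at t = {t:.2f}" if t else "no explosion in window"
    print(f"k = {k}: {result}")
```

For k = 1.5 the model crosses the cutoff at a finite time; for k = 1.0 and k = 0.5 it never does within the simulated window, which is the qualitative difference the "explosion" language points at.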
The intelligence explosion hypothesis does not imply or require:
• That more change occurred from 1970 to 2000 than from 1940 to 1970.
• That technological progress follows a predictable curve.
• That "Real AI" is even possible! (An intelligence explosion could happen with augmented humans.)



"Book smarts" vs. cognition:
"Book smarts" evokes Other stuff that happens
images of: in the brain:

• Calculus • Social persuasion


• Chess • Enthusiasm
• Good recall of facts • Reading faces
• Rationality
• Strategic ability




The scale of intelligent minds: a parochial view.

[Figure: a line running from "Village idiot" at one end to "Einstein" at the other.]

A more cosmopolitan view:

[Figure: a much longer line: "Mouse," then "Chimp," with "Village idiot" and "Einstein" close together near its far end.]
The power of intelligence:
• Fire
• Language
• Nuclear weapons
• Skyscrapers
• Spaceships
• Money
• Science



One of these things is not like the other...

• Space travel
• Extended lifespans
• Nanofactories
• Artificial Intelligence



Intelligence:
• The most powerful force in the known universe; we see its effects every day.
• The most confusing question in today's science: ask ten scientists, get ten answers.
• Not a complete mystery: there is a huge library of knowledge about mind, brain, and cognition, but it is scattered across dozens of different fields!
If I am ignorant about a phenomenon, this is a fact about my state of mind, not a fact about the phenomenon.

Confusion exists in the mind, not in reality.

There are mysterious questions. Never mysterious answers.
For more about intelligence:
• Go to http://singinst.org/ (or Google "Singularity Institute")
• Click on "Summit Notes"
• Lecture video, book chapters


The brain's biological bottleneck:
• Neurons run at 100 Hz
• No read access
• No write access
• No new neurons
• Existing code not human-readable



Relative difficulty:

• Build a Boeing 747 from scratch.

vs.

• Starting with a bird,
• Modify the design to create a 747-sized bird,
• That actually flies,
• As fast as a 747,
• Then migrate the actual living bird to the new design,
• Without killing the bird or making it very unhappy.



The AI Advantage (for self-improvement)

• Total read/write access to own state
• Absorb more hardware (possibly orders of magnitude more!)
• Understandable code
• Modular design
• Clean internal environment


Biological bottleneck (for serial speed)

• Lightspeed is >10⁶ times faster than signal propagation in axons and dendrites (rough check below).
• A synaptic spike dissipates >10⁶ times the thermodynamic minimum heat (though today's transistors do worse).
• Transistor clock speeds are >>10⁶ times faster than neuron spiking frequencies.
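A rough check on the first bullet, using a textbook value of ~100 m/s for fast myelinated axons (a number supplied here for illustration, not taken from the slides):

\[
\frac{c}{v_{\text{axon}}} \approx \frac{3 \times 10^{8}\ \text{m/s}}{10^{2}\ \text{m/s}} = 3 \times 10^{6}
\]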



[Figure: the cosmopolitan scale again: Mouse, Chimp, Village idiot, Einstein.]

• Physically possible to build a brain at least 1,000,000 times as fast as a human brain
• Even without shrinking the brain, lowering its temperature, quantum computing, etc.
• Drexler's Nanosystems says a sensorimotor speedup of >>10⁶ is also possible
• 1 year → 31 seconds (arithmetic below)
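The last bullet is just the millionfold speedup applied to a calendar year:

\[
1\ \text{year} \approx 3.15 \times 10^{7}\ \text{s}, \qquad \frac{3.15 \times 10^{7}\ \text{s}}{10^{6}} \approx 31.5\ \text{s}
\]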



10,000 years to nanotech? (for superintelligence)
• Solve a chosen special case of protein folding
• Order custom proteins from online labs with 72-hour turnaround time
• Proteins self-assemble into a primitive device that accepts acoustic instructions
• Use it to build 2nd-stage nanotech, 3rd-stage nanotech, etc.
• Total time: 10,000 subjective years ≈ 4 days of real time (arithmetic below)
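Reading the "10,000 years" as subjective thinking time at the millionfold speedup from the previous slides (this reading is an editorial assumption; the arithmetic itself is exact):

\[
\frac{10^{4}\ \text{yr} \times 365\ \text{d/yr}}{10^{6}} = 3.65\ \text{d} \approx 4\ \text{days}
\]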
Respect the power of creativity, and be careful what you call "impossible".





[Figure: a timeline of accelerating change: Hunter-gatherers → Agriculture → Ancient Greeks → Renaissance → Industrial Revolution → Electrical Revolution → Nuclear, Space, Computer, Internet → Biotech and Molecular Nanotechnology Revolutions.]

vs.

[Figure: a second scale of jumps: Bees → Hunter-gatherers; Chimps → Internet.]


Can an intelligence explosion be avoided?
• Self-amplifying once it starts to tip over
• Very difficult to avoid in the long run
• But many possible short-term delays
• Argument: A human-level civilization occupies an unstable state; will eventually wander into a superintelligent region or an extinct region.
Fallacy of the Giant Cheesecake
• Major premise: A superintelligence could
create a mile-high cheesecake.
• Minor premise: Someone will create a
recursively self-improving AI.
• Conclusion: The future will be full of giant
cheesecakes.

Power does not imply motive.





Spot the missing premise:

• A sufficiently powerful AI could wipe out humanity.
• [And the AI would decide to do so.]
• Therefore we should not build AI.

• A sufficiently powerful AI could develop new medical technologies and save millions of lives.
• [And the AI would decide to do so.]
• Therefore, build AI.


Design space of minds-in-general

[Figure: a vast space of possible minds, with regions labeled "Bipping AIs," "Freepy AIs," and "Gloopy AIs"; "All human minds" is one small region within it.]




AI isn't a prediction problem, it's an engineering problem.

We have to reach into mind design space, and pull out a mind such that we're glad we created it...

Challenge is difficult and technical!



"Do not propose solutions until the problem has been
discussed as thoroughly as possible without
suggesting any."
-- Norman R. F. Maier

"I have often used this edict with groups I have led -
particularly when they face a very tough problem,
which is when group members are most apt to
propose solutions immediately."
-- Robyn Dawes

(Dawes, R. M. 1988. Rational Choice in an Uncertain World. San Diego, CA: Harcourt Brace Jovanovich.)



What kind of AI do we want to see?

Much easier to describe AIs we don't want to see...



"Friendly AI"...

(the challenge of creating an AI


that, e.g., cures cancer, rather
than wiping out humanity)

...looks possible but very difficult.



The intelligence explosion:
Enough power to...
make the world a better place?



Someday,
the human species
has to grow up.
Why not sooner
rather than later?



In a hundred million years,
no one's going to care
who won the World Series,
but they'll remember the first AI.



For more information, please
visit the Singularity Institute at
http://singinst.org/
