Three Major Singularity Schools


(Originally appeared on the Machine Intelligence Research Institute blog, September 2007.)

Singularity discussions seem to be splitting up into three major schools of thought: Accelerating Change, the Event Horizon, and the Intelligence Explosion.

  • Accelerating Change:
    • Core claim: Our intuitions about change are linear; we expect roughly as much change as has occurred in the past over our own lifetimes. But technological change feeds on itself, and therefore accelerates. Change today is faster than it was 500 years ago, which in turn is faster than it was 5000 years ago. Our recent past is not a reliable guide to how much change we should expect in the future.
    • Strong claim: Technological change follows smooth curves, typically exponential. Therefore we can predict with fair precision when new technologies will arrive, and when they will cross key thresholds, like the creation of Artificial Intelligence.
    • Advocates: Ray Kurzweil, Alvin Toffler(?), John Smart
  • Event Horizon:
    • Core claim: For the last hundred thousand years, humans have been the smartest intelligences on the planet. All our social and technological progress was produced by human brains. Shortly, technology will advance to the point of improving on human intelligence (brain-computer interfaces, Artificial Intelligence). This will create a future that is weirder by far than most science fiction, a difference-in-kind that goes beyond amazing shiny gadgets.
    • Strong claim: To know what a superhuman intelligence would do, you would have to be at least that smart yourself. To know where Deep Blue would play in a chess game, you must play at Deep Blue’s level. Thus the future after the creation of smarter-than-human intelligence is absolutely unpredictable.
    • Advocates: Vernor Vinge
  • Intelligence Explosion:
    • Core claim: Intelligence has always been the source of technology. If technology can significantly improve on human intelligence – create minds smarter than the smartest existing humans – then this closes the loop and creates a positive feedback cycle. What would humans with brain-computer interfaces do with their augmented intelligence? One good bet is that they’d design the next generation of brain-computer interfaces. Intelligence enhancement is a classic tipping point; the smarter you get, the more intelligence you can apply to making yourself even smarter.
    • Strong claim: This positive feedback cycle goes FOOM, like a chain of nuclear fissions gone critical – each intelligence improvement triggering an average of >1.000 further improvements of similar magnitude (see the toy sketch just after this list) – though not necessarily on a smooth exponential pathway. Technological progress drops into the characteristic timescale of transistors (or super-transistors) rather than human neurons. The ascent rapidly surges upward and creates superintelligence (minds orders of magnitude more powerful than human) before it hits physical limits.
    • Advocates: I. J. Good, Eliezer Yudkowsky
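
To see why the threshold of >1 matters in the chain-reaction analogy, here is a toy sketch (assuming, purely for illustration, that each improvement independently triggers an average of k further improvements of similar magnitude):

```latex
% Toy criticality model: one seed improvement, each improvement
% triggering an average of k further improvements (illustrative assumption).
\[
  \mathbb{E}[\text{total improvements}]
    = 1 + k + k^{2} + k^{3} + \cdots
    = \begin{cases}
        \dfrac{1}{1-k}, & k < 1 \quad \text{(the cascade fizzles out at a finite total),} \\[1.5ex]
        \infty,         & k \ge 1 \quad \text{(the cascade is self-sustaining: it ``goes critical'').}
      \end{cases}
\]
```

On this toy model, the strong claim's ">1.000" is exactly the criticality condition: an average multiplier even slightly above one yields a self-sustaining cascade, while a multiplier below one gives only a finite burst of improvement.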

The thing about these three logically distinct schools of Singularity thought is that, while all three core claims support each other, all three strong claims tend to contradict each other.

If you extrapolate our existing version of Moore’s Law past the point of smarter-than-human AI to make predictions about 2099, then you are contradicting both the strong version of the Event Horizon (which says you can’t make predictions because you’re trying to outguess a transhuman mind) and the strong version of the Intelligence Explosion (because progress will run faster once smarter-than-human minds and nanotechnology drop it into the speed phase of transistors).
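
For concreteness, the kind of extrapolation the strong Accelerating Change claim licenses looks something like this toy calculation (assuming, for illustration only, a constant eighteen-month doubling time and 2007 as the baseline):

```latex
% Toy Moore's Law extrapolation from 2007 to 2099,
% assuming (for illustration) a constant 18-month doubling time.
\[
  \text{doublings} \approx \frac{2099 - 2007}{1.5\ \text{yr}} \approx 61,
  \qquad
  \text{growth factor} \approx 2^{61} \approx 2.3 \times 10^{18}.
\]
```

It is precisely the legitimacy of carrying a smooth curve like this through the transition that the other two strong claims deny.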

I find it very annoying, therefore, when these three schools of thought are mashed up into Singularity paste. Clear thinking requires making distinctions.

But what is still more annoying is when someone reads a blog post about a newspaper article about the Singularity, comes away with none of the three interesting theses, and spontaneously reinvents the dreaded fourth meaning of the Singularity:

  • Apocalyptism: Hey, man, have you heard? There’s this bunch of, like, crazy nerds out there, who think that some kind of unspecified huge nerd thing is going to happen. What a bunch of wackos! It’s geek religion, man.

I’ve heard (many) other definitions of the Singularity attempted, but I usually find them to lack separate premises and conclusions. For example, the old Extropian FAQ used to define the “Singularity” as the Inflection Point, “the time when technological development will be at its fastest” and just before it starts slowing down. But what makes this an interesting point in history apart from its definition? What are the consequences of this assumption? To qualify as a school of thought or even a thesis, one needs an internal structure of argument, not just a definition.

If you’re wondering which of these is the original meaning of the term “Singularity”, it is the Event Horizon thesis of Vernor Vinge, who coined the word.


This document is ©2007 by Eliezer Yudkowsky and free under the Creative Commons Attribution-No Derivative Works 3.0 License for copying and distribution, so long as the work is attributed and the text is unaltered.

Eliezer Yudkowsky’s work is supported by the Machine Intelligence Research Institute.

If you think the world could use some more rationality, consider blogging this page.

Praise, condemnation, and feedback are always welcome. The web address of this page is http://eyudkowsky.wpengine.com/singularity/schools/.