The Singularity

From H+Pedia
Revision as of 15:13, 20 August 2015 by Davidwwood (talk | contribs) (References)

The Singularity (more fully, the "Technological Singularity") is an envisaged future time period when artificial intelligence becomes more generally capable than humans. As a result of that development, an "intelligence explosion" might follow, as first described in 1965 by I.J. Good [1]:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. It is curious that this point is made so seldom outside of science fiction. It is sometimes worthwhile to take science fiction seriously.

This notion of "Technological Singularity" is often conflated with some related ideas:

  • The merger of biology and technology
  • The transcendence of biology by humans (as featured on the cover of the book The Singularity Is Near by Ray Kurzweil)
  • A period of time in which development happens exceedingly rapidly
  • A period of time beyond which it is impossible to foresee further developments

There is further scope for confusion from the fact that Singularity University avoids any focus on the Technological Singularity, and instead concentrates on the increasing pace of technological development.

This article, however, focuses on the primary definition given above, since that definition has a unique meaning.

If the Technological Singularity takes place, its potential outcomes range from extremely good to extremely bad.