The Singularity

From H+Pedia
Revision as of 09:06, 30 January 2016 by Deku-shrub (talk | contribs)

The Singularity (more fully, the "Technological Singularity") is an envisaged future period when artificial intelligence becomes more generally capable than humans. As a result of that development, an "intelligence explosion" might follow, as first described in 1965 by I.J. Good [1]:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. It is curious that this point is made so seldom outside of science fiction. It is sometimes worthwhile to take science fiction seriously.
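Good's argument can be illustrated with a toy model (our own construction, not part of Good's paper): if each machine's ability to design its successor scales with its own capability, then capability grows faster than exponentially. The parameter names and the linear-scaling assumption here are purely illustrative.

```python
# Toy sketch of an "intelligence explosion" under one assumed dynamic:
# each generation's improvement factor scales with the designer's
# own capability, so better designers yield disproportionately
# better successors.

def intelligence_explosion(initial=1.0, gain=0.1, generations=10):
    """Return the capability of each successive machine generation."""
    caps = [initial]
    for _ in range(generations):
        current = caps[-1]
        # The multiplier (1 + gain * current) itself grows as
        # capability grows, producing super-exponential growth.
        caps.append(current * (1 + gain * current))
    return caps

caps = intelligence_explosion()
```

Under these assumptions each generation's relative gain increases, which is the runaway character Good describes; with diminishing returns instead (e.g. a fixed multiplier), growth would stay merely exponential.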

This notion of the "Technological Singularity" is often conflated with related ideas, such as the general acceleration of technological change.

Further confusion arises from the fact that Singularity University does not focus on the Technological Singularity itself, but rather on the increasing pace of technological development.

This article focuses on the primary definition given above, since that definition has a distinct meaning.

If it takes place, the potential outcomes of the Technological Singularity range from extremely good to extremely bad.

References