The Singularity

From H+Pedia
 
External links
 
* [http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111/ Superintelligence: Paths, Dangers, Strategies, by Nick Bostrom]
 
* [http://www.amazon.com/Our-Final-Invention-Artificial-Intelligence/dp/1250058783/ Our Final Invention: Artificial Intelligence and the End of the Human Era, by James Barrat]
 
* [https://www.youtube.com/watch?v=GYQrNfSmQ0M The Long-term Future of (Artificial) Intelligence (video) by Stuart Russell]

Revision as of 04:32, 17 July 2015

The Singularity (more fully, the "Technological Singularity") is an envisaged future time period when artificial intelligence becomes more generally capable than humans. As a result of that development, an "intelligence explosion" might follow, as first described in 1964 by I.J. Good [1]:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. It is curious that this point is made so seldom outside of science fiction. It is sometimes worthwhile to take science fiction seriously.
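Good's argument is a feedback loop: each machine designs a more capable successor, which designs a still more capable one, and so on. A purely illustrative toy model can make the compounding dynamic concrete; the starting level, the per-generation design gain, and the assumption that capability compounds geometrically are all assumptions for illustration, not claims from the article:

```python
# Toy sketch of Good's "intelligence explosion": each generation of machine
# designs a successor whose capability is a fixed multiple of its own.
# The numbers (start level, design gain) are illustrative assumptions.

def intelligence_explosion(start=1.0, design_gain=1.5, generations=10):
    """Return the capability of each successive machine generation,
    assuming capability compounds by `design_gain` per generation."""
    levels = [start]
    for _ in range(generations):
        # Each machine's successor is design_gain times as capable as itself.
        levels.append(levels[-1] * design_gain)
    return levels

levels = intelligence_explosion()
# Under these assumptions growth is geometric: generation 10 reaches
# 1.5**10 (roughly 57.7) times the starting capability.
```

The point of the sketch is only that any self-reinforcing gain above 1.0 produces runaway growth; whether real machine intelligence would compound this way is exactly what the surrounding debate is about.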

This notion of "Technological Singularity" is often conflated with some related ideas:

  • The merger of biology and technology
  • A period of time in which development happens exceedingly rapidly
  • A period of time beyond which it is impossible to foresee further developments

This article, however, focuses on the primary definition given above, since it has a distinct meaning of its own.

If the Technological Singularity takes place, its potential outcomes range from extremely good to extremely bad.

References