Technological singularity

The technological singularity—or simply the singularity[1]—is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good's intelligence explosion model of 1965, an upgradable intelligent agent could eventually enter a positive feedback loop of successive self-improvement cycles; more intelligent generations would appear more and more rapidly, causing a rapid increase ("explosion") in intelligence that culminates in a powerful superintelligence, far surpassing all human intelligence.[4]
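Good's feedback loop can be illustrated numerically. The toy simulation below is a sketch, not Good's own formulation: the coupling of improvement rate to current intelligence is an illustrative assumption, chosen only to show how such a loop produces accelerating ("explosive") growth.

```python
# Toy sketch of an intelligence-explosion feedback loop.
# Assumption (not from Good's 1965 paper): each self-improvement cycle
# multiplies intelligence by a factor that itself grows with the agent's
# current intelligence, so smarter generations improve faster.

def intelligence_explosion(initial=1.0, coupling=0.1, generations=10):
    """Return intelligence levels over successive self-improvement cycles."""
    levels = [initial]
    for _ in range(generations):
        current = levels[-1]
        # The smarter the agent, the larger its next improvement step.
        levels.append(current * (1.0 + coupling * current))
    return levels

levels = intelligence_explosion()
# Growth accelerates: the ratio between successive generations keeps rising.
ratios = [b / a for a, b in zip(levels, levels[1:])]
assert all(r2 > r1 for r1, r2 in zip(ratios, ratios[1:]))
```

Under these assumptions the growth is super-exponential: not only does intelligence increase each cycle, the *rate* of increase grows as well, which is the defining feature of the hypothesized explosion.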

Some scientists, including Stephen Hawking, have expressed concern that artificial superintelligence (ASI) could result in human extinction.[5][6] The consequences of a technological singularity and its potential benefit or harm to the human race have been intensely debated.

Prominent technologists and academics dispute the plausibility of a technological singularity and the associated artificial intelligence explosion, including Paul Allen,[7] Jeff Hawkins,[8] John Holland, Jaron Lanier, Steven Pinker,[8] Theodore Modis,[9] Gordon Moore,[8] and Roger Penrose.[10] One line of argument is that growth in artificial intelligence is likely to run into diminishing returns rather than accelerating ones. Stuart J. Russell and Peter Norvig observe that in the history of technology, improvement in a particular area tends to follow an S-curve: it begins with accelerating improvement but eventually levels off, rather than continuing upward into a hyperbolic singularity.[11] Consider, for example, the history of transportation, which experienced exponential improvement from 1820 to 1970 but then abruptly leveled off. Predictions based on continued exponential improvement (e.g. interplanetary travel by 2000) proved false.
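The S-curve observation can be made concrete with a minimal numerical sketch. The growth rate and carrying capacity below are illustrative assumptions, not figures from Russell and Norvig; the point is only that a logistic (S-shaped) curve is nearly indistinguishable from an exponential early on, so extrapolating the exponential phase badly overshoots once the curve levels off.

```python
import math

def exponential(t, rate=0.5):
    """Unbounded exponential growth (illustrative rate)."""
    return math.exp(rate * t)

def logistic(t, rate=0.5, capacity=100.0):
    """S-curve: grows like the exponential at first, then
    levels off near the carrying capacity (illustrative values)."""
    return capacity / (1.0 + (capacity - 1.0) * math.exp(-rate * t))

# Early on the two curves are nearly indistinguishable...
assert abs(exponential(1) - logistic(1)) / exponential(1) < 0.02
# ...but extrapolating the exponential far past the inflection point
# overshoots the leveled-off S-curve by orders of magnitude.
assert exponential(20) > 100 * logistic(20)
```

This is the structure of the transportation example: observers in the exponential phase (1820–1970) who extrapolated the trend predicted outcomes, such as interplanetary travel by 2000, that the subsequent plateau never delivered.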

  1. ^ Cadwalladr, Carole (22 February 2014). "Are the robots about to rise? Google's new director of engineering thinks so…". The Guardian. Retrieved 8 May 2022.
  2. ^ "Collection of sources defining "singularity"". singularitysymposium.com. Archived from the original on 17 April 2019. Retrieved 17 April 2019.
  3. ^ Eden, Amnon H.; Moor, James H.; Søraker, Johnny H.; Steinhart, Eric, eds. (2012). Singularity Hypotheses: A Scientific and Philosophical Assessment. The Frontiers Collection. Dordrecht: Springer. pp. 1–2. doi:10.1007/978-3-642-32560-1. ISBN 9783642325601.
  4. ^ Vinge, Vernor. "The Coming Technological Singularity: How to Survive in the Post-Human Era" Archived 2018-04-10 at the Wayback Machine, in Vision-21: Interdisciplinary Science and Engineering in the Era of Cyberspace, G. A. Landis, ed., NASA Publication CP-10129, pp. 11–22, 1993. - "There may be developed computers that are "awake" and superhumanly intelligent. (To date, there has been much controversy as to whether we can create human equivalence in a machine. But if the answer is 'yes, we can', then there is little doubt that beings more intelligent can be constructed shortly thereafter.)"
  5. ^ Sparkes, Matthew (13 January 2015). "Top scientists call for caution over artificial intelligence". The Telegraph (UK). Archived from the original on 7 April 2015. Retrieved 24 April 2015.
  6. ^ "Hawking: AI could end human race". BBC. 2 December 2014. Archived from the original on 30 October 2015. Retrieved 11 November 2017.
  7. ^ Allen, Paul; Greaves, Mark (12 October 2011). "Paul Allen: The Singularity Isn't Near". MIT Technology Review.
  8. ^ a b c "Tech Luminaries Address Singularity". IEEE Spectrum. 1 June 2008.
  9. ^ Modis, Theodore (2012). "Why the Singularity Cannot Happen". In Eden, Amnon H.; Moor, James H.; Søraker, Johnny H.; Steinhart, Eric (eds.). Singularity Hypotheses: A Scientific and Philosophical Assessment. Dordrecht: Springer.
  10. ^ Penrose, Roger (1999). The emperor's new mind: concerning computers, minds and the laws of physics. Oxford: Oxford Univ. Press. ISBN 978-0-19-286198-6.
  11. ^ Russell, Stuart J.; Norvig, Peter (2021). Artificial Intelligence: A Modern Approach (4th ed.). Hoboken: Pearson. p. 1005. ISBN 978-0-1346-1099-3. LCCN 20190474.
