Technological singularity

From Wikipedia, the free encyclopedia

[Image: When plotted on a logarithmic graph, 15 separate lists of paradigm shifts for key events in human history show an exponential trend. Lists prepared by, among others, Carl Sagan, Paul D. Boyer, Encyclopædia Britannica, the American Museum of Natural History, and the University of Arizona; compiled by Ray Kurzweil.]

In futures studies, a technological singularity (often the Singularity) is a predicted future event believed to precede immense technological progress in an unprecedentedly brief time. Futurists give varying predictions as to the extent of this progress, the speed at which it occurs, and the exact cause and nature of the event itself.

One school of thought centers around the writings of I. J. Good, who predicted an "intelligence explosion" following the creation of the first mind smarter than a human mind: small improvements in intelligence would be used to develop larger improvements, which in turn allow for even larger improvements, ad infinitum. Vernor Vinge later popularized the concept in the 1980s with lectures, essays, and science fiction. Vinge argued that the Singularity would occur following the creation of strong artificial intelligence or sufficiently advanced intelligence amplification technologies such as brain-computer interfaces.

Another school, promoted heavily by Ray Kurzweil, claims that technological progress follows a pattern of exponential (or super-exponential) growth, suggesting rapid technological change in the 21st century. Kurzweil considers the advent of superhuman intelligence to be part of an overall exponential trend in human technological development, seen originally in Moore's Law and extrapolated into a general trend in Kurzweil's own Law of Accelerating Returns. Unlike a hyperbolic function, Kurzweil's predicted exponential model never experiences a true mathematical singularity.
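The distinction can be made precise. An exponential trajectory grows without bound but remains finite at every finite time, whereas a hyperbolic trajectory diverges at a finite time; only the latter is a singularity in the mathematical sense. As an illustrative comparison (not a formula taken from Kurzweil):

\[
x_{\text{exp}}(t) = x_0\, e^{kt} \qquad\text{vs.}\qquad x_{\text{hyp}}(t) = \frac{C}{t_* - t},
\]

where the exponential curve is finite for all finite \(t\), while the hyperbola blows up as \(t \to t_*\).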

While some regard the Singularity as a positive event and work to hasten its arrival, others view it as dangerous, undesirable, or unlikely to occur. The most practical means for initiating the Singularity are debated, as is how (or whether) it can be influenced or avoided if dangerous.

The Singularity is also frequently depicted in science fiction.

Intelligence explosion

I. J. Good (1965) writes:

"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make."

Mathematician and author Vernor Vinge greatly popularized Good's notion of an intelligence explosion in the 1980s, calling it the Singularity. Vinge first addressed the topic in print in the January 1983 issue of Omni Magazine. He later collected his thoughts in the 1993 essay "The Coming Technological Singularity", which contains the oft-quoted statement "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly thereafter, the human era will be ended."

Vinge writes that superhuman intelligences, however created, will be able to enhance their own minds faster than the humans who created them could. "When greater-than-human intelligence drives progress," Vinge writes, "that progress will be much more rapid." This feedback loop of self-improving intelligence, he predicts, will produce large amounts of technological progress within a short period of time.


Most proposed methods for creating smarter-than-human or transhuman minds fall into one of two categories: intelligence amplification of human brains and artificial intelligence. The speculated means of intelligence augmentation are numerous, and include bio- and genetic engineering, nootropic drugs, AI assistants, direct brain-computer interfaces, and mind transfer. Despite these many possible routes to amplifying human intelligence, non-human artificial intelligence (specifically seed AI) is the most popular option among organizations trying to directly initiate the Singularity, a choice the Singularity Institute addresses in its publication "Why Artificial Intelligence?" (2005).

George Dyson speculates in Darwin Among the Machines that a sufficiently complex computer network may produce "swarm intelligence", and that improved future computing resources may allow AI researchers to create artificial neural networks so large and powerful they become generally intelligent.

Potential dangers

Some speculate superhuman intelligences may have goals inconsistent with human survival and prosperity. AI researcher Hugo de Garis suggests AIs may simply eliminate the human race, and humans would be powerless to stop them. Other oft-cited dangers include molecular nanotechnology and genetic engineering. These threats are major issues for both Singularity advocates and critics, and were the subject of a Wired Magazine article by Bill Joy, Why the future doesn't need us (2000). Oxford philosopher Nick Bostrom summarizes the potential threats of the Singularity to human survival in his essay Existential Risks (2002).

Many Singularitarians consider nanotechnology to be one of the greatest dangers facing humanity. For this reason, they often believe seed AI should precede nanotechnology. Others, such as the Foresight Institute, advocate efforts to create molecular nanotechnology, claiming nanotechnology can be made safe for pre-Singularity use or can expedite the arrival of a beneficial Singularity.

Advocates of Friendly Artificial Intelligence acknowledge the Singularity is potentially very dangerous and work to make it safer by creating AI that will act benevolently towards humans and eliminate existential risks. This idea is also embodied in Isaac Asimov's Three Laws of Robotics, intended to prevent artificially intelligent robots from harming humans, though the crux of Asimov's stories is often how the laws fail.

Accelerating change

[Image: A logarithmic timeline showing an exponentially accelerating trend toward increasing frequency of major events (as chosen by Theodore Modis) in human and natural history. Some critics of Kurzweil's theory dispute the choice of such specific events.]

Some proponents of the Singularity argue for its inevitability by extrapolating past trends, especially those pertaining to the shortening gaps between improvements to technology. In one of the first uses (if not the first use) of the term "singularity" in the context of technological progress, Stanislaw Ulam invoked accelerating change:

"One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue." —May 1958, referring to a conversation with John von Neumann

In his book "Mindsteps to the Cosmos" (HarperCollins, August 1983), Gerald S. Hawkins elucidated his notion of 'mindsteps', dramatic and irreversible changes to paradigms or world views. He identified five distinct mindsteps in human history, and the technology that accompanied these "new world views": the invention of imagery, writing, mathematics, printing, the telescope, rocket, computer, radio, TV... "Each one takes the collective mind closer to reality, one stage further along in its understanding of the relation of humans to the cosmos." He noted: "The waiting period between the mindsteps is getting shorter. One can't help noticing the acceleration." Hawkins' empirical 'mindstep equation' quantified this, and gave dates for future mindsteps. The date of next mindstep (5; the series begins at 0) is given as 2021, with two more successively closer mindsteps, until the limit of the series in 2053. His speculations ventured beyond the technological:

"The mindsteps... appear to have certain things in common - a new and unfolding human perspective, related inventions in the area of memes and communications, and a long formulative waiting period before the next mindstep comes along. None of the mindsteps can be said to have been truly anticipated, and most were resisted at the early stages. In looking to the future we may equally be caught unawares. We may have to grapple with the presently inconceivable, with mind-stretching discoveries and concepts."

Since the late 1970s, others like Alvin Toffler (author of Future Shock), Daniel Bell and John Naisbitt have advanced theories of postindustrial society that resemble visions of near- and post-Singularity societies. They argue the industrial era is coming to an end, and that services and information are supplanting industry and goods. Some more extreme visions of the postindustrial society, especially in fiction, envision the elimination of economic scarcity.

Many sociologists and anthropologists have created social theories of sociocultural evolution. Some, like Lewis H. Morgan, Leslie White, and Gerhard Lenski, declare technological progress to be the primary factor driving the development of human civilization. Morgan's three major stages of social evolution can be divided by technological milestones. Rather than pointing to specific inventions, White argued that the measure by which to judge the evolution of a culture is its control of energy, which he described as "the primary function of culture." His model eventually led to the creation of the Kardashev scale. Lenski takes a more modern approach, holding that the more information a given society has, the more advanced it is.

Kurzweil's law of accelerating returns

[Image: Kurzweil writes that, due to paradigm shifts, the trend of exponential growth extends from integrated circuits back through earlier transistors, vacuum tubes, relays, and electromechanical computers.]

Ray Kurzweil justifies his belief in an imminent singularity by an analysis of history from which he concludes that technological progress follows a pattern of exponential growth. He calls this conclusion The Law of Accelerating Returns. He generalizes Moore's law, which describes exponential growth in integrated semiconductor complexity, to include technologies from far before the integrated circuit.
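As a concrete illustration of this kind of extrapolation (a sketch only, not Kurzweil's own model; the baseline figures and doubling period below are assumptions chosen for the example), a trend with a fixed doubling time can be projected forward as follows:

```python
def doubling_extrapolation(n0: float, t0: float, t: float, doubling_time: float) -> float:
    """Project a quantity n0 measured at time t0 forward to time t,
    assuming it doubles every `doubling_time` years (a Moore's-law-style trend)."""
    return n0 * 2 ** ((t - t0) / doubling_time)

# Illustrative: roughly 2,300 transistors on the Intel 4004 in 1971,
# doubling every two years, extrapolated to 2006. Prints about 4.3e+08,
# the right order of magnitude for a 2006-era CPU.
print(f"{doubling_extrapolation(2300, 1971, 2006, 2.0):.1e}")
```

Kurzweil's generalization replaces the single technology (integrated circuits) with a succession of paradigms, but the arithmetic of the extrapolation is the same.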

Whenever technology approaches a barrier, he writes, new technologies will cross it. He predicts paradigm shifts will become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history" (Kurzweil 2001). Kurzweil believes the Singularity will occur before the end of the 21st century, setting the date at 2045 (Kurzweil 2005). His predictions differ from Vinge's in that he predicts a gradual ascent to the Singularity, rather than Vinge's rapidly self-improving superhuman intelligence. The distinction is often made with the terms soft and hard takeoff.

Criticism

Theodore Modis and Jonathan Huebner have argued, from different perspectives, that the rate of technological innovation has not only ceased to rise, but is actually now declining. John Smart has criticized their conclusions. Others criticize Kurzweil's choices of specific past events to support his theory.

An article in The Economist spoofs projections of technological trends by suggesting that the number of razor blades on disposable safety razors may rise to infinity by 2015.

In fact, "technological singularity" is just one of a few singularities detected through the analysis of a number of characteristics of the World System development, for example, with respect to the world population, world GDP, and some other economic indexes (e.g., Johansen, A., and D. Sornette. 2001. Finite-time Singularity in the Dynamics of the World Population and Economic Indices. Physica A 294(3–4): 465–502). It has been shown (e.g., Korotayev A., Malkov A., Khaltourina D. Introduction to Social Macrodynamics: Secular Cycles and Millennial Trends. Moscow: URSS, 2006) that the hyperbolic pattern of the world population and technology growth (observed for many centuries, if not millennia prior to the 1970s) could be accounted for by a rather simple mechanism, the nonlinear second order positive feedback, that was shown long ago to generate precisely the hyperbolic growth, known also as the "blow-up regime" (implying just finite-time singularities). In our case this nonlinear second order positive feedback looks as follows: more people – more potential inventors – faster technological growth – the carrying capacity of the Earth grows faster – faster population growth – more people – more potential inventors – faster technological growth, and so on. On the other hand, this research has shown that since the 1970s the World System does not develop hyperbolically any more, its development diverges more and more from the blow-up regime, and at present it is moving "from singularity", rather than "toward singularity".

Jürgen Schmidhuber calls the Singularity Omega, referring to Teilhard de Chardin's Omega point (1916). For Omega = 2040 he says the series Omega - 2^n human lifetimes (n<10; one lifetime = 80 years) roughly matches the most important events in human history. But he also questions the validity of such lists, suggesting they just reflect a general rule for "both the individual memory of single humans and the collective memory of entire societies and their history books: constant amounts of memory space get allocated to exponentially larger, adjacent time intervals further and further into the past." He suggests that this may be the reason "why there has never been a shortage of prophets predicting that the end is near - the important events according to one's own view of the past always seem to accelerate exponentially."
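Schmidhuber's series is simple to tabulate. A minimal sketch (using the Omega = 2040 and 80-year lifetime stated above; the mapping of non-positive years to BC dates is a convention added for the example):

```python
OMEGA = 2040   # Schmidhuber's assumed convergence date
LIFETIME = 80  # years per human lifetime

# Dates lying Omega - 2^n lifetimes in the past, for n < 10.
for n in range(10):
    year = OMEGA - (2 ** n) * LIFETIME
    print(f"n={n}: {year if year > 0 else f'{1 - year} BC'}")
# n=0 -> 1960, n=1 -> 1880, n=2 -> 1720, n=3 -> 1400, n=4 -> 760,
# n=5 -> 521 BC, and so on, each step doubling the distance into the past.
```

Each interval between consecutive dates is twice as long as its successor, which is exactly the "constant memory per exponentially larger interval" allocation Schmidhuber describes.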

Potential dangers and neo-Luddism

Some argue advanced technologies are simply too dangerous for humans to morally allow them to be built, and advocate efforts to stop their invention. Theodore Kaczynski, the Unabomber, writes that technology may enable the upper classes of society to "simply decide to exterminate the mass of humanity." Alternatively, if AI is not created, Kaczynski argues that humans "will have been reduced to the status of domestic animals" after sufficient technological progress. Portions of Kaczynski's writings have been included in both Bill Joy's article and in a recent book by Ray Kurzweil. Kaczynski, however, not only opposes the Singularity but also supports neo-Luddism generally; many people oppose the Singularity without opposing present-day technology as Luddites do.

Along with Kaczynski, many other anti-civilization theorists, such as John Zerzan and Derrick Jensen, represent the school of anarcho-primitivism or eco-anarchism, which sees the rise of the technological singularity as an orgy of machine control and the loss of a feral, wild, and uncompromisingly free existence outside the factory of domestication (civilization). Environmental groups such as the Earth Liberation Front and Earth First! likewise see the singularity as a force to be resisted at all costs. Author and social change strategist James John Bell has written for Earth First! as well as mainstream science and technology publications such as The Futurist, offering a cautionary environmentalist perspective on the singularity in essays including Exploring The "Singularity" and Technotopia and the Death of Nature: Clones, Supercomputers, and Robots. The publication Green Anarchy, to which Kaczynski and Zerzan are regular contributors, has also published articles on resistance to the technological singularity, e.g. A Singular Rapture, written under the name MOSH (a reference to Kurzweil's M.O.S.H., "Mostly Original Substrate Human").

Just as Luddites opposed artifacts of the industrial revolution out of concern for their effects on employment, some opponents of the Singularity are concerned about future employment opportunities. Although Luddite fears about jobs were not realized, given the growth in employment after the industrial revolution, there was an effect on involuntary employment: child labor and labor by the elderly decreased dramatically. It can be argued that only a drop in voluntary employment should be of concern, not a reduced level of absolute employment (a position held by Henry Hazlitt). Economically, a post-Singularity society would likely have more wealth than a pre-Singularity society, via increased knowledge of matter and energy manipulation to meet human needs. One possible post-Singularity future, therefore, is one in which per capita wealth increases dramatically while per capita employment decreases.

Fictional depictions

In addition to the Vernor Vinge stories that pioneered Singularity ideas, several other science fiction authors have written stories that involve the Singularity as a central theme. Notable authors include William Gibson, Charles Stross, Karl Schroeder, Greg Egan, David Brin, Iain M. Banks, Neal Stephenson, Bruce Sterling, Damien Broderick, Fredric Brown, and Jacek Dukaj. Ken MacLeod describes the Singularity as "the Rapture for nerds" in his 1998 novel The Cassini Division. Singularity themes are common in cyberpunk novels, one example being the recursively self-improving AI Neuromancer in William Gibson's novel of the same name. The Metamorphosis of Prime Intellect, a 1994 novel later published on Kuro5hin, depicts life after an AI-initiated Singularity. A more dystopian treatment is Harlan Ellison's short story I Have No Mouth, and I Must Scream. Yet another example is Accelerando by Charles Stross.

University of Texas chemist Eamonn Healy provides his own take on the Singularity concept in the film Waking Life. He describes the acceleration of evolution by breaking it down into "two billion years for life, six million years for the hominid, a hundred thousand years for mankind as we know it", then describes the acceleration of human cultural evolution as ten thousand years for agriculture, four hundred years for the scientific revolution, and one hundred fifty years for the industrial revolution. He concludes we will eventually create "neohumans" that will usurp humanity's present role in scientific and technological progress and allow the exponential trend of accelerating change to continue past the limits of human ability.

Organizations and other prominent voices

The Singularity Institute for Artificial Intelligence is a 501(c)(3) nonprofit research institute for the study and advancement of beneficial AI. They are working to shape what statistician I. J. Good called the "intelligence explosion," with the additional goal of fostering broader discussion and understanding of Friendly Artificial Intelligence. They focus on Friendly AI because they believe strong AI will enhance cognition before human cognition can be enhanced by neurotechnologies or somatic gene therapy. The Institute employs Tyler Emerson as executive director, Carolyn L. Burke as director of communications, Allison Taguchi as director of development, AI researcher Eliezer Yudkowsky as a research fellow, and Marcello Herreshoff and Michael Wilson as research associates.

The Acceleration Studies Foundation (ASF), an educational nonprofit, was formed to attract broad scientific, technological, business, and social-change interest in acceleration and evolutionary development studies. They produce Accelerating Change, an annual conference on multidisciplinary insights into accelerating technological change held at Stanford University, and maintain Acceleration Watch, an educational site discussing accelerating technological change.

References

  • Broderick, D. (2001). The Spike: How Our Lives Are Being Transformed by Rapidly Advancing Technologies. New York: Forge. ISBN 0312877811.
  • Bostrom, N. (2003). "Ethical Issues in Advanced Artificial Intelligence". Cognitive, Emotive and Ethical Aspects of Decision Making in Humans and in Artificial Intelligence. 2: 12–17.
  • Bostrom, N. (2002). "Existential Risks". Journal of Evolution and Technology. 9.
  • Good, I. J. (1965). "Speculations Concerning the First Ultraintelligent Machine", in Franz L. Alt and Morris Rubinoff (eds.), Advances in Computers, vol. 6, pp. 31–88. Academic Press.
  • Joy, B. (April 2000). "Why the future doesn't need us". Wired Magazine. 8.04.
  • Kurzweil, R. (2001). "The Law of Accelerating Returns".
  • Kurzweil, R. (2005). The Singularity Is Near. New York: Viking. ISBN 0670033847.
  • Singularity Institute for Artificial Intelligence (2005). "Why Artificial Intelligence?".
  • Ulam, S. (1958). "Tribute to John von Neumann". Bulletin of the American Mathematical Society. 64 (3, part 2): 1–49.
  • Vinge, V. (1993). "The Coming Technological Singularity".
