Artificial life

from Wikipedia, the free encyclopedia

Artificial life (AL or ALife; in the German literature also KL, for Künstliches Leben) is, in its weak or moderate form, the study of natural living systems, their processes and their evolution by means of computer-aided simulation, and in its radical or strong form the creation of artificial life through synthetic biology. The discipline was named in 1986 by the American computer scientist Christopher Langton. There are three approaches: soft, from software; hard, from hardware; and wet, from biochemistry.

Artificial life research program


The research is interdisciplinary and draws on biochemistry, philosophy, physics, computer science, mathematics, engineering and other fields. Three research approaches are intertwined:

Wet artificial life synthesizes living subsystems of lower order (genes, proteins) and higher order (tissues, organs, organisms) biochemically; it emerged from synthetic biology, genetic engineering and systems biology.

Soft artificial life models life in software simulations, hard artificial life in robots. For both, natural organisms are treated like software systems that can be programmed by engineers.

A precise definition of what life is - natural (strong AL, autopoiesis) or artificial (weak AL, modeling) - is not regarded as a precondition for AL research. That question is the concern of systems biology, and its phenomenology is left to philosophy. Research results obtained from natural systems and from simulation are considered equivalent in AL.


The analysis of self-regulating processes goes back to Norbert Wiener (1948), whose cybernetics focuses more on the functions of a system than on its components. In 1952 the mathematician Alan Turing wrote a pioneering paper on how spatial inhomogeneity can arise in a chemical medium and lead to morphogenetic pattern formation (the Turing mechanism). Turing's work was extended, both theoretically and with numerous concrete examples of biological pattern formation, by Alfred Gierer and Hans Meinhardt. The founding work of the AL research discipline, although not yet so named, is the model of the cellular automaton presented by the mathematician John von Neumann (published posthumously in 1966), a computational and representational framework for problems of biological organization, self-reproduction and the evolution of complexity. Automata theory has since been developed further and simplified in many ways; another well-known example is Conway's Game of Life (1970). In Stephen Wolfram's theory of cellular automata (1984), automata of different complexity were classified into four classes. The American computer scientist Christopher Langton laid the formal foundation of the new AL discipline in 1986/87 and gave it its name. He initially defined AL as man-made life in contrast to natural life. Later he stressed that nature includes humans and that humans, together with their artifacts, should not see themselves as outside it: AL should therefore not move away from biology, but aim to narrow the gap between the two.
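To make Wolfram's rule classification concrete, a one-dimensional binary cellular automaton can be sketched in a few lines. This is a minimal illustration, not any specific system from the AL literature; the rule encoding follows Wolfram's standard 8-bit rule numbering, and the fixed-zero boundary is a simplifying assumption:

```python
def step(cells, rule=30):
    """One synchronous update of a 1-D binary cellular automaton.

    Each cell's next state depends only on itself and its two
    neighbours; `rule` is Wolfram's 8-bit rule number. Cells beyond
    the ends are treated as permanently 0 (a simplification)."""
    padded = [0] + cells + [0]
    out = []
    for i in range(1, len(padded) - 1):
        # read the 3-cell neighbourhood as a number 0..7
        n = (padded[i - 1] << 2) | (padded[i] << 1) | padded[i + 1]
        # look up the corresponding bit of the rule number
        out.append((rule >> n) & 1)
    return out

# a single live cell under rule 30, one of Wolfram's chaotic rules
row = [0] * 7 + [1] + [0] * 7
for _ in range(5):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```

Varying the `rule` argument (0 to 255) reproduces the behaviours Wolfram sorted into his four classes, from immediate quiescence to chaos and localized structures.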

Research questions

Artificial life reproduces existing forms of life (life-as-we-know-it) and creates possible forms of life (life-as-it-could-be). Mark A. Bedau formulated three questions for artificial life, from which fourteen subordinate research tasks, or open problems, were derived:

  1. What is the origin of life or how does life arise from non-life?
  2. What are the potentials and limitations of living systems?
  3. How does life relate to intelligence, culture, society and human art objects?

Limitations of computer simulations are also discussed. Interactions between the components of living systems generate new information (emergence) that limits predictability. A challenge is the adaptability of populations of artificial intelligences that, for example, communicate with each other on the Internet, learn from each other and adapt evolutionarily to increasingly complex environmental changes. That artificial life might one day no longer be seen as artificial, but as just as real as natural life, remains a vision against the background of these challenges.

Relationship with artificial intelligence

AL overlaps with artificial intelligence (AI) and is seen either as a subdiscipline of it or, conversely, as a superordinate discipline. AL must integrate AI insights, since cognition is a core property of natural life, not only of humans. The various requirements for AL - dynamic hierarchies, evolution, evolution of complexity, self-organization, self-replication, cognition and others - are today, across the three subdisciplines of wet, soft and hard artificial life, often realized only incompletely or reductionistically, in individual aspects or a few combinations, for example the goal of self-reproduction with evolution and increasing evolutionary complexity. Such goals therefore remain visionary (see also superintelligence).


Typical requirements, from a larger list of AL topics, are:

Autonomy and self-organization

Autonomy is a central property of life. AL systems therefore require some control over their own production, which can be achieved through self-organization. Living systems exhibit autopoiesis, a bounded network of processes that maintains its own organization. A system that is autonomous in the strict sense allows the observer to speak of it as an individual acting according to intrinsic goals, and therefore as a genuine agent. Autonomy is a prerequisite for self-organization. Self-organization is the ability of an (artificial) system to find its own form or function: local interactions in the system lead to global patterns or behaviors. In an AL system, self-organization should be possible on different hierarchical levels, analogous to biology, where it is found at the level of genes, cells and organisms.
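How purely local interactions can produce a global pattern can be sketched with a toy model (illustrative only, not any specific AL system): agents on a ring repeatedly nudge their heading toward the average of their two neighbours, and global alignment emerges without any central control:

```python
import random

def align_step(headings, rate=0.5):
    """Purely local rule: each agent nudges its heading toward the
    mean of its two ring neighbours. No agent sees the whole system."""
    n = len(headings)
    return [h + rate * ((headings[i - 1] + headings[(i + 1) % n]) / 2 - h)
            for i, h in enumerate(headings)]

random.seed(0)
headings = [random.uniform(0.0, 360.0) for _ in range(20)]
initial_spread = max(headings) - min(headings)
for _ in range(500):
    headings = align_step(headings)
final_spread = max(headings) - min(headings)
# the disordered start self-organizes into a globally aligned state
```

Note that headings are averaged as plain numbers rather than as circular angles; this keeps the sketch short while still showing the local-to-global effect.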

Self-reproduction and increasing complexity

Self-reproduction (or reproduction) can take place by duplicating data structures under definable conditions. In von Neumann's sense, a distinction must be made between mere self-replication and adaptable self-reproduction. Only machines with the latter property theoretically allow unlimited growth of complexity by means of heritable mutations. This is necessary for AL, since natural evolution also shows unlimited growth of complexity in the long term. AL must therefore find a solution to the apparently paradoxical problem of machines producing machines more complex than themselves, even if only to a very limited extent from generation to generation. Von Neumann called such a machine a "general constructive automaton": starting from a simple machine, it can, in a theoretically infinite chain of reproductions, produce machines of arbitrarily high complexity.

The death of an individual is realized by deleting its data structure under definable conditions. (Sexual) reproduction, in which the properties of the data structures of two individuals are combined, is also possible.

Adaptation

Adaptation is defined for an AL system as a change in an agent or system in response to a state of the environment; the change can help the system achieve its goal. A system that allows reproduction with increasing complexity (see above) is adaptable, but adaptation is more. It can take place on a slow scale (several generations) as evolutionary adaptation, on a medium scale (one generation) as development or morphogenesis (e.g. autoregulatory models of limb development), or on a fast scale (part of a generation) as learning. AL systems must therefore, among other things, be able to learn. Evolution can be realized through variation in reproduction and through selection in reproduction and death; defining the conditions for these events creates selection pressure. Scientifically used programs with the evolutionary abilities of artificial life include Tierra, developed in 1991 by Thomas S. Ray as the first digital system with spontaneous evolution, and its derivative Avida from Michigan State University.
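The triad described above, variation in reproduction plus selection in reproduction and death, can be sketched as a minimal evolutionary loop. This is a toy model in the spirit of, but far simpler than, systems like Tierra or Avida; all names and parameter values are illustrative:

```python
import random

def evolve(pop_size=30, genome_len=16, generations=80,
           mutation=0.02, seed=1):
    """Minimal evolutionary loop: genomes are bit lists, fitness is
    the number of 1-bits, reproduction copies the fitter half with
    point mutations (variation), and the less fit half is deleted
    (selection in death)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    best_at_start = max(sum(g) for g in pop)
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)
        parents = pop[:pop_size // 2]           # selection in reproduction
        children = [[bit ^ (rng.random() < mutation) for bit in p]
                    for p in parents]           # variation (point mutation)
        pop = parents + children                # the bottom half has "died"
    return best_at_start, max(sum(g) for g in pop)

start_best, final_best = evolve()
```

Because the fitter half survives unchanged each generation (elitism), the best fitness in the population can never decrease, while mutated offspring occasionally improve on it.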

Hierarchy and Emergence

Biological systems are structured hierarchically. They consist of lower and higher hierarchical levels such as genes, gene products (proteins), cells, cell associations (tissues), organs, organisms and societies. Technical systems are also organized hierarchically, for example from chips, circuit boards, processors, computers and the Internet. A central requirement for AL is to explain how robust, dynamic multi-level hierarchies can arise solely from the interaction of elements on the lowest level. Conway's Game of Life, for example, can create multiple levels of hierarchy; in contrast to biological systems, however, these are not robust: changing a single parameter, or a few, can cause the system to collapse. "Dynamic" means that the number of hierarchical levels in a system is not fixed but can vary over the course of evolution.
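The Game of Life mentioned above is compact enough to state in full. A sketch on an unbounded grid, representing the world as a set of live cell coordinates:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life on an unbounded grid.

    `live` is the set of (x, y) coordinates of currently live cells.
    A cell is live in the next generation if it has exactly 3 live
    neighbours, or 2 live neighbours and is itself live (B3/S23)."""
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# the "blinker", a period-2 oscillator: a row flips to a column and back
blinker = {(0, 1), (1, 1), (2, 1)}
```

Higher-level objects such as blinkers, still lifes and gliders are exactly the emergent second-level structures the text refers to: nothing in the rule mentions them, yet they persist and interact.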

Hierarchical systems can show emergence: a single H2O molecule, for example, is not liquid, but many together are at suitable temperatures. A significant advance in understanding robust, dynamic hierarchies would be achieved if, analogous to biology, a model could generate objects on the lowest level through simple rules, from whose interactions different, emergent objects arise on the second level without external intervention; objects on a third level with new emergent properties would then arise from the interactions of the second-level objects, and so on. Models of this kind exist, for example from Steen Rasmussen, or the aforementioned Game of Life, but they are not robust. If such artificial systems evolve through adaptation, their higher-level properties are not designed but arise unplanned through selection. Today's models of this kind show only a limited, simple evolution, both in their evolutionary potential and in their hierarchical levels. Innovations and system transitions as known from biological evolution have not yet been technically realized.

Networking and communication

Networking, with information exchange between the simulated living beings and their simulated environment, is an essential condition for AL systems to show emergence. Without a network infrastructure, emergent behavior would disappear quickly, like transient fluctuations. Networks are a prerequisite for stabilizing emergent phenomena and for a system of increasing complexity to achieve robustness. Networking and communication ultimately also enable the formation of social structures in AL systems.

Cognitive abilities

Cognitive abilities on lower levels, such as visual and auditory perception, can be provided for simulated living beings with the help of artificial neural networks or other structures from artificial intelligence (AI); machine learning is one of these. Simple neural networks are passive: they react to sensory inputs and generate a clearly assignable output. For example, a moving system "sees" an obstacle and avoids it. Active networks, which AL systems require, especially for simulating higher levels of cognition, can do more. Such "ecological networks" can generate their own input, deal with different times (past, present and future) and predict the future. This is done on the basis of internal states that represent memory, and by deriving the next input from the system's own output. Such systems can learn from consequences by changing their internal states and their external environment. The network is thus active, in contrast to a homogeneous, classical, passive neural network. For example, an AL system learns through experimentation that an object is fragile; it remembers (stores) that the object broke. When handling such an object again, it recognizes its identity or similarity with other objects and consequently handles it carefully. The system has learned, and the environment is no longer what it was before the object broke.
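The contrast between a passive stimulus-response mapping and an active agent with internal state can be sketched as follows. Both are deliberately minimal illustrations of the distinction drawn above, not models of any particular AL system:

```python
def reflex(sensors, weights, threshold=0.5):
    """Passive network: a fixed weighted sum of sensory inputs mapped
    directly to an output. Same input, same output, no memory."""
    activation = sum(s * w for s, w in zip(sensors, weights))
    return 'turn' if activation > threshold else 'forward'

class ActiveAgent:
    """Active agent: internal state (a memory of past consequences)
    changes how future, identical inputs are handled."""
    def __init__(self):
        self.fragile = set()            # remembered kinds of object

    def handle(self, obj, broke=False):
        if broke:
            self.fragile.add(obj)       # learn from the consequence
            return 'broken'
        return 'careful' if obj in self.fragile else 'normal'
```

The reflex agent avoids an obstacle but never changes, whereas the active agent handles a kind of object carefully only after one of them has broken in its hands, mirroring the fragile-object example in the text.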

Language can be understood as the highest level of cognition. While classical AI analyzes the evolution of language as purely symbolic manipulation, detached from the environment, as a laboratory experiment, AL aims to analyze and simulate the evolution of language at the population level in an "ecological network" under realistic living conditions. This network should include the brain, the body and the external environment.

It is possible to produce systems with the abilities described. However, the theoretical claim that an AL system should evolve cognitive abilities on different levels simultaneously - from simple perception, motor skills, memory, association and categorization up to language, inherited along with adaptations - is extremely high. In addition, the fundamental relationship between life and mind, with its emergence from biological material (e.g. consciousness, pain or other qualia), is not yet understood. According to current knowledge, AL systems can therefore simulate individual qualia but cannot possess them. The Turing test is a benchmark for developing cognitive abilities, including language, in an AL system.


AL uses a number of special methods.

Selected practical fields of application

Today there are AL applications in synthetic biology, in healthcare and medicine, in ecology, in autonomous robots, in the transport and traffic sector, in computer graphics, for virtual societies and in computer games, among others.

Health system and medicine

The modern healthcare system in highly developed societies has become a complex system in which complex, intelligent AL applications are being tested. The health sector, or parts of it, can be simulated in a specific virtual world. The aim is to understand system behavior and identify responses to tentative system changes. Financial aspects such as cost explosions in the health sector should, for example, be recognized by gradually more adaptable AL applications. Here, as in other AL application areas, the following holds: the more complex the inner workings of the agents and the more complex the stimulating system environment, the more intelligence is required to enable such systems to behave more and more realistically.

In medicine, for example, deformable organisms based on AL are used. These are autonomous agents whose task is to automatically segment, label and quantitatively analyze anatomical structures in medical images. Analogous to natural organisms that can make voluntary movements, artificial organisms of this type have deformable bodies with distributed sensors and rudimentary brains with centers for movement, perception, behavior and cognition.

Tissue engineering and regenerative medicine will increasingly benefit from the integration of AL methods. Computer-aided (neuronally connected) sensory prostheses are also counted as AL in individual cases; at the least, it is said that AL can contribute to better prosthetic techniques.

Environmental science and ecology

Ecological AL studies examine interactions between artificial individuals of different species and their environment. In the environmental sciences and ecology, there are studies on sustainable architectures that were developed on the basis of AL but also integrate nanotechnology and biotechnology alongside other disciplines. AL applications also exist for recycling tasks; other fields of application are resource management and land use. The living conditions of the biosphere have been simulated on a global level (Daisyworld).

Autonomous Robotics

Even if the reference to AL is not always made explicit, robotics counts as AL, especially when the machines are capable of learning or adapting. Robots operating in groups with swarm intelligence capabilities are a rapidly growing field that is being commercialized; the natural behavior of insect swarms serves as the model here. One example is plant pollination by robotic bees.

Computer viruses

Computer worms and computer viruses can be described as a very common form of artificial life: both reproduction and evolution, two conditions for artificial life, exist in this type of computer program. Primitive ways of exchanging information have also already been developed in computer viruses. In the current arms race between developers on both sides, however, there is "no evidence of an effective, autonomous evolution of 'free-living' malware".

Transportation and traffic

The modern transport sector offers a wide range of optimization options using AL methods. This applies to various optimization tasks and, beyond physical traffic routes, also to communication networks, especially the Internet. Here, too, the behavior of insect colonies serves as a model.

In the transport sector, mass phenomena such as traffic jams are simulated in virtual worlds, as is the behavior of pedestrian crowds in large cities.

Computer games

Commercially, forms of artificial life are increasingly used in computer games with life simulation, for example in the computer game Creatures, in which primitive, adaptive artificial beings with a metabolism and genome were created. The evolutionary Tierra and the Tamagotchi should also be mentioned here. One of the best-known, very simple simulations is the cellular-automaton-based Game of Life by John Horton Conway.


The idea of artificial living beings is old, a topos of myths and legends, fairy tales and sagas, of works ranging from pulp fiction to world literature, and of film. For most of human cultural history it was by no means undisputed, across religions, that all life on earth is of divine origin. Rather, until the 19th century it was taken for granted at all times, even within Christianity, that simple living beings can and may also arise spontaneously from dead matter under suitable conditions. On this view it was considered a trivial fact that anyone with everyday knowledge could produce small organisms such as molds, maggots or worms within a few days from decaying meat, vegetable waste, excrement, mud or dirt at suitable temperatures. Accordingly, there were innumerable written instructions on how such life could be generated by humans, up to and including the production of frogs and mosquitoes from water and earth or the formation of "carnivorous animals" from curdling breast milk (Apocalypse of Peter). What matters here is not whether these views are scientifically obsolete today, but only that people at the time were unequivocally convinced of them.

Antiquity and the Middle Ages - myths and spontaneous emergence of life

Greek mythology is full of artificial creatures that served humans or accomplished superhuman feats. Alongside the creations of the gods, the creations of artists and inventors play a central role; examples from antiquity are the creatures of Hephaestus, the animated automaton dolls of Daidalos and the statue of Pygmalion that comes to life. In the legends and sagas of the Middle Ages, numerous lively and multi-talented artificial beings appear, for example speaking heads, some with the ability to prophesy. Aristotle explains in his work De generatione animalium (On the Generation of Animals), using the example of shellfish, how these can be generated from purely material principles on the basis of his natural philosophy. The Hindu epic Mahabharata describes how one hundred human clones were created within a month from a dead lump of meat that had been divided up, with the help of a hundred pots and clarified butter. The golem, whose medieval origins are obscure, is a mute human-like being made of clay, often of enormous size and strength, that can carry out orders. The Arab engineer al-Jazarī created early programmable humanoid automata in the 12th century, and of Albertus Magnus it is written that he possessed a bronze head ("Brazen Head") that could answer questions, as well as a mechanical servant that opened the door for visitors and greeted them. In the 14th century, Konrad von Megenberg described in detail how bees could properly be made from the rotting flesh of the bellies of young cattle covered with dung, or from ox hides buried in the ground, and what had to be avoided in the process. Leonardo da Vinci's inventions include a mechanical knight that could stand, sit and move its arms independently.

17th and 18th centuries - animal machines and human machines

With René Descartes in the 17th century a strictly mechanistic worldview of life arose, according to which animals, and for some later thinkers also humans, were seen as clockwork. Francis Bacon pursued a programmatic, multi-stage approach to the creation of artificial life in his utopian work Nova Atlantis, written in 1623. His widely known concept aimed in its last stage at the creation of snakes, worms, flies and fish, which were ultimately to develop into sexually reproducing birds and quadrupeds. For centuries, Bacon's program represented the most comprehensive synthetic claim to the creation of artificial life, comparable to the visions of today's synthetic biology.

In the 17th and especially the 18th century, with the breakthrough of various technical innovations, machine-men and human machines, whose forerunners had already been admired in antiquity, multiplied. Various automatons were developed in the Renaissance and Baroque periods, some of which could perform complicated actions. Jacques de Vaucanson presented an artificial flute player in 1738 and shortly afterwards his mechanical duck, which could waddle, eat and digest. The engineer Wolfgang von Kempelen finally developed a chess-playing Turk, which, however, turned out to be a hoax. At that time there were no ethical objections to such experiments.

Parallel to these mechanical developments, Carl von Linné presented his comprehensive classification of living beings in the 18th century. He placed particular emphasis on the reproduction of each species, since species membership could be determined in this way; the consequence was that the prevailing view of spontaneous generation could now be maintained only for microorganisms. Johann Wolfgang von Goethe reflects ideas of a humanoid with perfect human characteristics in the figure of the Homunculus in Faust II. In his ballad The Sorcerer's Apprentice, probably inspired by the figure of the golem, Goethe describes the danger and the possible consequences of an artificial living being that has got out of control.

19th century - evolutionary theory and organic synthetic chemistry

The 19th century saw a new wave of attention to the subject of artificial life with Mary Shelley's novel Frankenstein (1818), whose impact continued throughout the century.

Efforts to prove (Félix Archimède Pouchet) or disprove (Louis Pasteur) the spontaneous generation of microorganisms intensified into the 19th century. Owing to the limited performance of microscopes, however, they long came to no definitive result, as the multiplication of microorganisms could not yet be made visible. It was only Darwin's theory of evolution that placed life on earth in a causal historical context of variation and natural selection and ultimately no longer permitted exceptions for the spontaneous generation of life. In the second half of the 19th century, chemists, beginning with Marcelin Berthelot, also gained greater weight; they rejected so-called vitalism, the conviction that life is characterized by a special organizational principle or a special vital force and therefore cannot be produced in the laboratory. Julius Eugen Schloßberger put it thus:

"The artificial preparation of purely organic substances, if it were to succeed on a large scale, would have to be regarded as the greatest triumph of the chemist; with the synthesis of organic bodies thus made possible, according to scientific principles and at will, the most important means would be provided for man to make himself as materially independent as possible of the surrounding living nature; the most extraordinary field would then be opened up for all applications of chemistry."

- Julius Eugen Schlossberger, 1854

Soon thousands of chemists were working on the project of imitating nature, and by 1870 some 10,000 new organic compounds had been synthesized. Toward the end of the 19th century the chemist Emil Fischer, Nobel Prize winner of 1902, did pioneering work: he succeeded in producing natural substances synthetically and in this way determining and systematizing their often complex molecular structures. Emil Fischer is therefore considered a forerunner of today's synthetic biology.

20th Century - A multitude of heralds of artificial life

At the beginning of the 20th century, the modification and creation of organisms through organic chemistry became a vision; the aim was to surpass and dominate nature. In 1907 Fischer put it this way: chemical-synthetic biology is "far superior to nature through its purposeful variation, and we are now able to call into existence creatures that correspond more to our wishes than what we have previously found on our dear earth." In 1905 the British physicist John Benjamin Butler Burke (* 1873) claimed in Cambridge to have created artificial life with the help of radium, whose radioactivity had been discovered a few years earlier. Charles C. Price (1913-2001), president of the American Chemical Society, called in 1965 for making the production of life a national research goal. Shortly afterwards the biologist James Frederic Danielli (1911-1984) put forward the idea of chemically synthesizing complete chromosomes or genomes and transferring them into a target cell; according to him, this would make it possible to realize the comprehensive goals that synthetic biology later repeatedly propagated.

The well-known Miller-Urey experiment (1953) by Stanley Miller and Harold Urey, in which amino acids were synthesized in the laboratory for the first time, in a simulated primordial atmosphere with the aid of electrical discharges, caused a stir. The discovery of the structure of DNA in the same year was a further milestone on which the development of the idea of artificial life built. In 1967 the American biochemist Arthur Kornberg, Nobel laureate in Medicine in 1959, made headlines with the synthesis of a viral genome; in 1970 Danielli claimed to have carried out the first artificial synthesis of a living cell, while at the same time pointing out the potential dangers of such experiments; and in 1972 Paul Berg achieved the first recombination of DNA, laying the foundation for genetic engineering. In summary, for broad sections of the population the past century was also shaped by a large number of announcements from the scientific side, alongside successful literary events such as Aldous Huxley's novel Brave New World (1932) or Martin Caidin's science fiction novel Cyborg (1972), that the creation of artificial life had either already succeeded or was easy to accomplish and imminent.

21st Century - Minimal Genome and AL Research Establishment

Minimal genome and AL research discipline

Today artificial creatures in the form of robots and software agents have become a matter of course; their classification within the history of ideas of artificial life is a plausible but by no means self-evident view. Synthetic biology is expected to enable the creation of artificial life one day. What synthetic biology currently does, however, is far removed from the mythological beings described above. The focus of research is on bacteria with artificial genetic material, as first presented in the media in 2008 by a US research group around Craig Venter and the Nobel laureate in Medicine Hamilton O. Smith. The method used is a systematic modification of life through the step-by-step elimination of genome sequences. The result is a reproductive minimal genome, which can in principle be expanded with various genetic functions for specific (commercial) tasks. Synthetic biology sees these tasks in the repeatedly invoked areas of climate protection (e.g. CO2 reduction), health (e.g. artificial vaccines), the environment (e.g. waste disposal or removal of crude oil from the sea) and nutrition. The claim to be able to create artificial life with this method has been discussed critically. A clear distinction from conventional genetic engineering cannot be drawn, so that various commentators speak only of a modification of life. Furthermore, the approach is accused of a long-superseded gene-centrism, according to which life forms and processes can be fully explained by the number, arrangement and interaction of genes, and the cell reduced to its genome.

Development of AL research discipline

See: #Artificial life research program

Top-down versus bottom-up approach

Attempts of the kind presented by Venter and colleagues are referred to as a top-down approach, in contrast to the bottom-up approach of, for example, Petra Schwille, who tries to build minimal versions of an artificial cell from individual biomolecular components, with exemplary functions such as an artificial cell membrane and artificial cell division. AL systems are typically bottom-up, in contrast to typically top-down AI systems with a central control instance. Compared to AI systems, AL systems are implemented as autonomous low-level agents that interact with one another simultaneously and whose decisions are based on the information from this communication. Complex multicellular life forms such as mammals, with their many simultaneous characteristics of life, are not achievable by artificial means in the foreseeable future. The anticipated progress in genome editing (CRISPR/Cas) will make it even more difficult to distinguish between the modification of life and the creation of new life.

Criticism

In science, artificial life is used to study certain aspects of biological life. In the vision of software engineers, the computer should allow arbitrary modeling of life structures and environments. The justification for the procedure of creating life artificially is the principle verum quia factum of the Italian philosopher Giambattista Vico, according to which only what we have made ourselves can be recognized as true. However, according to Joachim Schummer, evidence that life must be artificially created in order to be better understood has not yet been provided. Schummer and others note critically that the information technology concerned with producing artificial life does not contribute precisely to distinguishing between visions, simulations and real developments. According to Schummer, AL is ultimately based on an outdated, strictly causal, deterministic conviction according to which, on the one hand, life can be completely and unambiguously decomposed into functional modules and, on the other, all these components are determined by single or combined gene sequences.

Literature


  • O. Awodele, O. O. Taiwo, S. O. Kuyoro: An Overview of Artificial Life. International Journal of Advanced Studies in Computer Science and Engineering (IJASCSE), Volume 4, Issue 12, 2015 (PDF)
  • Wendy Aguilar, Guillermo Santamaría-Bonfil, Tom Froese and Carlos Gershenson: The past, present, and future of artificial life. Frontiers in Robotics and AI, 10 October 2014. doi:10.3389/frobt.2014.00008
  • Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 .
  • Christoph Adami: Introduction to Artificial Life. Springer, New York NY a. a. 1998, ISBN 0-387-94646-2 , (with 1 CD-ROM (12 cm)).
  • Steven Levy : KL - Artificial Life from the Computer. Droemer Knaur, Munich 1993, ISBN 3-426-26477-3 .


Web links

Individual evidence

  1. [differentiation between two concepts of artificial life].
  2. The MIT Encyclopedia of the Cognitive Sciences. The MIT Press, p. 37. ISBN 978-0-262-73144-7
  3. Mark A. Bedau: Artificial life: organization, adaptation and complexity from the bottom up (PDF). TRENDS in Cognitive Sciences, November 2003. Archived from the original on December 2, 2008; retrieved January 19, 2007.
  4. Maciej Komosinski and Andrew Adamatzky : Artificial Life Models in Software . Springer, New York 2009, ISBN 978-1-84882-284-9 .
  5. Andrew Adamatzky and Maciej Komosinski: Artificial Life Models in Hardware . Springer, New York 2009, ISBN 978-1-84882-529-1 .
  6. Wendy Aguilar, Guillermo Santamaría-Bonfil, Tom Froese, Carlos Gershenson: The past, present, and future of artificial life. Front. Robot. AI, 10 October 2014. doi: 10.3389/frobt.2014.00008
  7. Marc A. Bedau: Artificial life: organization, adaptation and complexity from the bottom up. TRENDS in Cognitive Sciences, Vol. 7, No. 11, November 2003. PDF
  8. Wolfgang Banzhaf, Barry McMullin: Artificial Life. In: Grzegorz Rozenberg, Thomas Bäck, Joost N. Kok (Eds.): Handbook of Natural Computing. Springer 2012. ISBN 978-3-540-92909-3 (print), 978-3-540-92910-9 (online)
  9. Norbert Wiener: Cybernetics, or Control and Communication in the Animal and the Machine. Wiley, 1948
  10. Alan M. Turing (1952): The Chemical Basis of Morphogenesis. Philosophical Transactions of the Royal Society of London, Series B, Volume 237, No. 641, pp. 37-72, doi: 10.1098/rstb.1952.0012.
  11. Alfred Gierer, Hans Meinhardt (1972): A Theory of Biological Pattern Formation. Kybernetik 12, 30-39
  12. Meinhardt, H., 1982. Models of Biological Pattern Formation. Academic Press, London
  13. John von Neumann: Theory of Self-Reproducing Automata. University of Illinois Press, 1966
  14. Stephen Wolfram: Computation Theory of Cellular Automata. Commun. Math. Phys. 96, 15-57 (1984). PDF
  15. Christopher Langton: Studying Artificial Life with Cellular Automata. Physica D 22: 120-149
  16. Christopher Langton: What is Artificial Life? (1987). PDF (Memento of March 11, 2007 in the Internet Archive)
  17. ^ Langton, CG (1998). A new definition of artificial life. PDF
  18. ^ Mark A. Bedau, John S. McCaskill, Norman H. Packard, Steen Rasmussen, Chris Adami, David G. Green, Takashi Ikegami, Kunihiko Kaneko, Thomas S. Ray. Open Problems in Artificial Life. Artificial Life 6: 363-376 (2000)
  19. Steen Rasmussen, Nils A. Baas, Bernd Mayer, Martin Nilsson: Ansatz for dynamical hierarchies. In: Artificial Life. Vol. 7, No. 4, 2001, pp. 329-353, doi: 10.1162/106454601317296988.
  20. ^ J. Maynard Smith, Eörs Szathmáry: The Major Transitions in Evolution. Oxford University Press, New York 1995, ISBN 0-19-850294-X
  21. Domenico Parisi: Artificial life and higher level cognition. Brain Cogn. 1997 Jun;34(1):160-84. doi: 10.1006/brcg.1997.0911
  22. Thomas Nagel . Mind and cosmos. Why the materialistic neo-Darwinian conception of nature is almost certainly wrong. Suhrkamp paperback science. 2016. ISBN 978-3-518-29751-3
  23. Ghassan Hamarneh, Chris McIntosh, Tim McInerney, Demetri Terzopoulos: Deformable Organisms: An Artificial Life Framework for Automated Medical Image Analysis. Chapman & Hall/CRC, Boca Raton, FL, p. 433. PDF
  24. Semple JL, Woolridge N, Lumsden CJ: In vitro, in vivo, in silico: computational systems in tissue engineering and regenerative medicine. Tissue Eng. 2005 Mar-Apr;11(3-4):341-56.
  25. Mark A. Bedau, John S. McCaskill, Norman H. Packard, Steen Rasmussen, Chris Adami, David G. Green, Takashi Ikegami, Kunihiko Kaneko, and Thomas S. Ray: Open Problems in Artificial Life. Artificial Life, Fall 2000, Vol. 6, No. 4, pp. 363-376
  26. Konstantinos Dermitzakis, Marco Roberto Morales, Andreas Schweizer: Modeling the Frictional Interaction in the Tendon-Pulley System of the Human Finger for Use in Robotics. Artificial Life. 2013, Vol. 19, No. 1, pp. 149-169
  27. Rachel Armstrong: Systems Architecture: A New Model for Sustainability and the Built Environment using Nanotechnology, Biotechnology, Information Technology, and Cognitive Science with Living Technology. Artificial Life 16: 73-87 (2010). PDF
  28. K. Okuhara, E. Domoto, N. Ueno, H. Fujita: Recycling design using the artificial life technology to optimize evaluation function. 2003 EcoDesign 3rd International Symposium on Environmentally Conscious Design and Inverse Manufacturing
  29. Bousquet, F., and Page, C. L. (2004): Multi-agent simulations and ecosystem management: a review. Ecol. Model. 176, 313-332. doi: 10.1016/j.ecolmodel.2004.01.011
  30. Matthews, R., Gilbert, N., Roach, A., Polhill, J. G., and Gotts, N. (2007): Agent-based land-use models: a review of applications. Landsc. Ecol. 22, 1447-1459. doi: 10.1007/s10980-007-9135-1
  31. M. Eaton, J. Collins (2009): Artificial life and embodied robotics: current issues and future challenges. Artif Life Robot 13(2): 406-409
  32. Malachy Eaton. Further explorations in evolutionary humanoid robotics. Artificial Life and Robotics. March 2008, Volume 12, Issue 1, pp 133-137
  33. Alecksa Erikson: Robotic bees are now being built to pollinate crops instead of real bees. (October 5, 2016)
  34. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 32
  35. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 33
  36. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 45f.
  37. Delrio's 1599 Disquis. Magic., Vol. I, Ch. iv., p. 31.
  38. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 40f.
  39. Leonardo's robot
  40. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . Pp. 42-44.
  41. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 62f.
  42. Julius E. Schlossberger (1854): Textbook of organic chemistry, 3rd ed. Stuttgart, Müller, p. 27.
  43. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 73
  44. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 . P. 74f.
  45. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 , pp. 77f.
  46. Emil Fischer: "Festrede" (1907). In: Emil Fischer Papers, Bancroft Library of the University of California; quoted after Joachim Schummer 2011, p. 76.
  47. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 , pp. 87f.
  48. Daniel G. Gibson et al.: Complete Chemical Synthesis, Assembly, and Cloning of a Mycoplasma genitalium Genome. In: Science. Volume 319, No. 5867, 2008, pp. 1215-1220, PMID 18218864, doi: 10.1126/science.1151721
  49. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5
  50. From Natural to Artificial Cells - Petra Schwille Appointed New Director at the MPI of Biochemistry Max Planck Institute of Biochemistry October 13, 2011
  51. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 , p. 104.
  52. Joachim Schummer. The work of God. The artificial production of life in the laboratory. Suhrkamp Berlin. Edition Unseld Volume 39. 2011. ISBN 978-3-518-26039-5 , p. 103.