Neural network

Basic structure of cortico-cortical association and commissural fibers in the connectome model of the human cerebral cortex
Neural connections in the nervous system of the roundworm Caenorhabditis elegans: the network of all of its roughly 300 nerve cells
Neurons linked via synapses
Neuritic and dendritic branches of their cell processes characterize the shape of neurons, here in the auditory cortex (drawing by Cajal, 1898)

In neuroscience, a neural network is any number of interconnected neurons that, as part of a nervous system, form a circuit serving a specific function. In computational neuroscience, the term also covers simplified models of such biological networks.

In computer science, information technology, and robotics, such structures are modeled as artificial neural networks and are technically reproduced, simulated, and modified.

The networking of neurons

The nervous system of humans and animals consists of nerve cells (neurons), glial cells, and their surrounding environment. The neurons are linked to one another via synapses, which can be understood as connection points or nodes of an interneuronal network. In addition, chemical and electrical exchange takes place between neurons and cells of the neuroglia, in particular oligodendroglia and astroglia, which can change the weighting of signals.

The “circuitry” of a neuron typically has several inputs and one output. If the sum of the input signals exceeds a certain threshold value, the neuron “fires” (excitation generation): an action potential is triggered at the axon hillock, formed in the initial segment, and transmitted along the axon (excitation conduction). Series of action potentials are the primary output signals of neurons. These signals can be transmitted to other cells via synapses (transmission of excitation). At electrical synapses, the potential changes are passed on by direct contact; at chemical synapses, they are converted into a transmitter quantum as a secondary signal, i.e. transmitted by messenger substances.
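
The threshold behavior described above can be sketched in a few lines of Python, in the spirit of a classic McCulloch-Pitts-style threshold unit. The weights, the threshold, and the input values below are illustrative assumptions, not taken from the article:

```python
def fires(inputs, weights, threshold):
    """Return True if the weighted sum of inputs exceeds the firing threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total > threshold

# Illustrative values: three inputs, the third one inhibitory (negative weight).
print(fires([1.0, 1.0, 1.0], [0.6, 0.4, -0.3], threshold=0.5))  # True: 0.7 > 0.5
```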

Schematic representation of a simple neural network.
Divergence (green): A neuron transmits signals to several other neurons.
Convergence (blue and yellow): A neuron receives signals from several others.

Characteristic of nerve cells are their cell processes, with which they make contact with individual other cells. As dendrites, these processes serve primarily to receive signals from other cells, while signals to other cells are passed on via the neurite, which is called an axon where it is ensheathed by glial cells.

Through branches of its axon, known as axon collaterals, the signal of a neuron can be transmitted efferently to several other neurons (divergence). A neuron can also receive afferent signals from various other neurons (convergence), primarily via its dendrites as inputs.

While at electrical synapses the action potential carried across the membrane is passed on promptly as an electrical signal, at chemical synapses it is first converted into a secondary chemical signal at the presynaptic membrane region of a neuron. This takes place as potential-dependent neurocrine secretion, through the release (exocytosis) of the molecules of a signal substance stored in synaptic vesicles.

After bridging the narrow synaptic cleft by diffusion, this messenger substance acts as a neurotransmitter, or as a neuromodulatory cotransmitter, on the membrane region of the postsynaptic cell, provided it is equipped with suitable receptor molecules and thus receptive to it.

Upon receptor binding, a transmitter is recognized and a temporary regional change in membrane permeability is initiated, either directly (ionotropically) or indirectly (metabotropically). Small ions flowing in or out through the membrane cause postsynaptic changes in potential as local electrical signals. The incoming signals run along the membrane of a neuron and are integrated both spatially and over time, combining cumulatively.

Such postsynaptic potentials develop differently, depending on the membrane's equipment with receptors and ion channels. As graded potentials, they can be not only signals of different strength but also fundamentally different in quality: excitatory ones promote the generation of excitation, inhibitory ones impede the formation of an action potential.
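
A minimal sketch of this spatial and temporal integration, assuming a simple leaky integrator whose membrane potential decays toward rest while excitatory (positive) and inhibitory (negative) postsynaptic potentials are summed onto it; all parameter values are illustrative, not physiological measurements:

```python
REST, THRESHOLD, LEAK = -70.0, -55.0, 0.9  # illustrative values (mV, decay factor)

def integrate(psp_trains):
    """psp_trains: per-time-step lists of PSP amplitudes arriving at the membrane."""
    v = REST
    for psps in psp_trains:
        v = REST + (v - REST) * LEAK + sum(psps)  # leak toward rest, then sum inputs
        if v >= THRESHOLD:
            return "action potential"
    return "no action potential"

# Two excitatory inputs outweigh one inhibitory input over two time steps.
print(integrate([[6.0, 6.0, -2.0], [6.0, 6.0, -2.0]]))
```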

With this form of synaptic connection as chemical transmission, signals acquire a sign. Furthermore, they can be weighted, strengthened, or weakened depending on the processes at the connection point. Frequently repeated transmission in rapid succession can lead to longer-lasting changes that intensify synaptic transmission (long-term potentiation). At low frequencies, changes can occur in various ways that lead to a lasting weakening (long-term depression). In this way, the signal transmission process itself can shape or reshape the synaptic mode (neural plasticity). The networking of neurons thus does not exhibit rigid connections, but rather a weighting of the signal path that depends on its previous state and changes with repeated use.
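
The use-dependent weighting described here can be caricatured in code. The following toy sketch strengthens a synaptic weight under high-frequency stimulation (long-term potentiation) and weakens it under low-frequency stimulation (long-term depression); the cutoff frequency and learning rate are illustrative assumptions, not physiological constants:

```python
def update_weight(w, stimulation_hz, cutoff_hz=10.0, rate=0.05):
    """Frequency-dependent weight change: LTP above the cutoff, LTD below it."""
    if stimulation_hz >= cutoff_hz:
        return w * (1 + rate)  # long-term potentiation: strengthen
    return w * (1 - rate)      # long-term depression: weaken

w = 1.0
for _ in range(20):
    w = update_weight(w, stimulation_hz=50.0)
print(round(w, 3))  # clearly above 1.0 after repeated high-frequency use
```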

Learning

There are various, now well-standardized theories about learning in neural networks. The first neural learning rule was described by Donald O. Hebb in 1949 (Hebb's learning rule); significant developments came about, among others, through the work of the Finn Teuvo Kohonen in the mid-1980s.
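
Hebb's learning rule can be stated compactly: a synaptic weight grows when pre- and postsynaptic activity coincide. The following minimal sketch applies the update Δw = η · y · x; the learning rate and activity values are illustrative assumptions:

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Hebb's rule: strengthen each weight by eta * postsynaptic * presynaptic."""
    return w + eta * y * x

w = np.zeros(3)                # initial synaptic weights
x = np.array([1.0, 0.0, 1.0])  # presynaptic activities
y = 1.0                        # postsynaptic activity
w = hebbian_update(w, x, y)
print(w)  # [0.1 0.  0.1] -- strengthened only where both sides were active
```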

This resulted in typical properties of neural networks that apply equally to natural and artificial "neural systems". These include the property that they can learn complex patterns without any abstraction of the rules that may underlie those patterns. This means that neural networks do not follow a system of logic, but rather a (in a certain sense intuitive) pattern processing; see also artificial intelligence. It also means that the rules do not have to be worked out before learning. Conversely, any logic that determined the network's learning success cannot be derived from the neural network retrospectively.

This does not mean, however, that logical behavior and precise rules cannot be learned or applied by neural networks; they merely have to be laboriously worked out through training, for example when the grammar of a language is learned over years. Neural networks learn not explicitly, but implicitly: a toddler first learns the grammar of its mother tongue implicitly; only later, as a schoolchild, does it generally learn the rules explicitly.

The following applies especially to the simulation of artificial neural networks in science and technology:

  • The "correct" training of a neural network is a prerequisite for learning success or for the correct processing of a pattern in a nervous system.
  • Conversely, a prediction about the “correct” interpretation of a pattern by a neural network is not possible with precision unless that specific network, with its specific learning experience, is applied or computed. Neural networks thus face the problem that, after the learning process, patterns which do not resemble the models in the training set provoke stochastic (i.e. seemingly “random”) behavior of the output neurons; they work not exactly, but approximately (see the sketch after this list).
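
The sketch below illustrates this approximate behavior under strong simplifying assumptions: a single linear unit is fitted to two training patterns by gradient descent, and a dissimilar, never-seen pattern then produces an output that depends on the random initialization rather than on anything learned. All values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])  # training patterns
t = np.array([1.0, -1.0])        # target outputs
w = rng.normal(size=3)           # random initial weights

for _ in range(100):             # simple gradient descent on squared error
    w -= 0.1 * X.T @ (X @ w - t) / len(t)

print(X @ w)                          # close to the targets [1, -1]
print(np.array([1.0, 1.0, 0.0]) @ w)  # unseen pattern: residue of the random start
```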

Research

  • A neural network was first presented in 1894 by Siegmund Exner.
  • The study of the biochemical and physiological properties of neural networks is a subject of neurophysiology.
  • In neuroinformatics and artificial intelligence research, the functionality of neural networks is simulated by artificial neural networks in software, or the properties of neural networks are made usable for software applications (see also applications of artificial intelligence).
  • A conceptual abstraction of neural networks also takes place in theoretical biology.
  • In computational neuroscience in particular, model neurons, which abstract from the biological conditions to varying degrees, are connected into networks via simulated synapses in order to examine their dynamics and their capacity to process information (see the sketch after this list). For mathematically simple models this is done by mathematical analysis, but mostly also by computer simulations.
  • In the 1980s and 1990s, physicists also entered this field and contributed significantly to its understanding. Neural networks are currently used for analysis in high-energy physics; such multivariate methods are an important tool for separating experimental data.
  • Artificial neural networks, simulations of natural neural networks, are now often used to study the functioning of neural networks, since experiments can be carried out on artificial systems that natural systems do not permit.
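
As a minimal sketch of the kind of network simulation mentioned above: three rate-based model neurons are coupled through a weight matrix standing in for simulated synapses, and the dynamics are iterated to a steady state. The weight matrix, inputs, and time constant are illustrative assumptions, not a published model:

```python
import numpy as np

def step(rates, W, external, tau=0.5):
    """One Euler step of a simple firing-rate model with a tanh nonlinearity."""
    return rates + tau * (-rates + np.tanh(W @ rates + external))

W = np.array([[0.0, 0.8, -0.5],
              [0.8, 0.0, -0.5],
              [0.5, 0.5,  0.0]])    # simulated synaptic weights
rates = np.zeros(3)                 # initial firing rates
external = np.array([0.3, 0.3, 0.0])

for _ in range(50):
    rates = step(rates, W, external)
print(rates.round(3))  # steady-state firing rates of the three model neurons
```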

Literature

  • C. W. Eurich: What does a cat see? [Neural coding and reconstruction]. Brain & Mind, 3/2003.
  • Sven B. Schreiber: Natural intelligence. Neurons and synapses - all just an organic computer? (Part 1). c't - magazine for computer technology, 1987 (4), pp. 98-101.

Web links

Commons: Neural network - collection of images, videos, and audio files

Individual evidence

  1. Olaf Breidbach: Hirn, Hirnforschung. In: Werner E. Gerabek, Bernhard D. Haage, Gundolf Keil, Wolfgang Wegner (eds.): Enzyklopädie Medizingeschichte. De Gruyter, Berlin/New York 2005, ISBN 3-11-015714-4, pp. 600 f.; here: p. 600 (and p. 1543).
  2. Siegmund Exner: Draft for a physiological explanation of psychological phenomena. 1894; reprint: Wissenschaftlicher Verlag Harri Deutsch, Frankfurt am Main 1999, p. 193.