Neuroinformatics

Neuroinformatics is a branch of computer science and neurobiology that deals with information processing in neural systems in order to apply it in technical systems. It is a strongly interdisciplinary research area at the interface between AI research and cognitive science. Similar to neural AI, neuroinformatics is concerned with the inner workings of the brain, whose mode of operation is investigated by simulating its basic building blocks, neurons and synapses, and their interconnection.

Neuroinformatics is a neighboring field of computational neuroscience, which, as a branch of neurobiology, deals with understanding biological neural systems by means of mathematical models. It should also be distinguished from the discipline known as neuroinformatics in the English-speaking world, which deals with organizing neuroscientific data using computer-based methods.

Sub-areas of neuroinformatics

Neural methods are mainly used when information has to be extracted from poor or noisy data, but algorithms that adapt to new situations, i.e. that learn, are also typical of neuroinformatics. A basic distinction is made between supervised learning and unsupervised learning; reinforcement learning is a compromise between the two techniques. Associative memories are a special application of neural methods and are therefore often a subject of research in neuroinformatics. Many applications of artificial neural networks can also be found in pattern recognition and especially in image understanding.
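
To make the distinction concrete, the following sketch shows supervised learning in its simplest neural form: a single artificial neuron (perceptron) whose weights are corrected by an explicit teacher signal. The task (the logical AND function), the learning rate and all names are illustrative assumptions, not taken from the article.

    # Minimal sketch of supervised learning with a single artificial neuron
    # (perceptron); task, learning rate and names are illustrative assumptions.
    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
    targets = np.array([0, 0, 0, 1])                # teacher signal: logical AND

    weights = np.zeros(2)  # "synaptic" weights
    bias = 0.0
    learning_rate = 0.1

    for epoch in range(100):
        for x, t in zip(X, targets):
            # Weighted sum of the inputs followed by a hard threshold
            output = 1 if weights @ x + bias > 0 else 0
            # Perceptron learning rule: correct the weights by the error
            error = t - output
            weights += learning_rate * error * x
            bias += learning_rate * error

    print([1 if weights @ x + bias > 0 else 0 for x in X])  # expected: [0, 0, 0, 1]

In unsupervised learning the teacher signal (targets) would be absent and the algorithm would have to discover structure in the inputs on its own; in reinforcement learning only a reward signal would be available instead of the correct output.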

Universities

Neuroinformatics is a relatively young and small part of computer science, but institutes, departments or working groups for neuroinformatics can be found at many universities .

Biological foundations of neural networks

Figure: Dendritic branches of two neurons, each with an axon emerging from its cell body.

Nerve cells occur in the bodies of all tissue animals (eumetazoans) and are understood as the basic functional units of their nervous system. A typical neuron consists of three parts:

  • the dendrites, which receive signals from other cells,
  • the cell body (soma, also called the perikaryon), and
  • the axon, which passes the cell's output signal on to other cells.

In a brain, the mostly gray-brown nerve cell bodies (accumulations of perikarya in one place, also called nuclei) form the so-called gray matter, and the lighter, myelinated nerve cell processes (fiber bundles of axons with a similar course, also called tracts) form the so-called white matter. The dendritic trees spread out in the surrounding tissue with varying degrees of branching and therefore do not form a macroscopically conspicuous structure.

Neurons are linked by synapses at the points where excitation is transmitted from one neuron to another. The electrical signal is seldom passed on directly; it is usually carried across the 20–30 nm synaptic gap with the help of a neurotransmitter. Such chemical synapses are classified as excitatory or inhibitory according to the response of the downstream (postsynaptic) cell. In a nerve cell, the changes in membrane potential (postsynaptic potentials) caused by the various synapses are summed over the cell body and integrated at the axon hillock. If a certain threshold is exceeded, an action potential is triggered in the neuron and propagated along its axon.
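
The summation-and-threshold behaviour described above can be captured by one of the simplest neuron models used in such simulations, a leaky integrate-and-fire unit. The sketch below is only an illustration; the time constants, the threshold and the random synaptic input are assumptions chosen for the example, not values from the article.

    # Minimal sketch of a leaky integrate-and-fire neuron: postsynaptic
    # potentials are summed, and an action potential ("spike") is emitted
    # once the membrane potential exceeds a threshold. All constants are
    # illustrative assumptions.
    import numpy as np

    dt = 1.0           # time step (ms)
    tau = 10.0         # membrane time constant (ms)
    v_rest = 0.0       # resting potential (arbitrary units)
    v_threshold = 1.0  # firing threshold
    v = v_rest

    rng = np.random.default_rng(1)
    spike_times = []
    for step in range(100):
        # Sum of the postsynaptic potentials arriving in this time step
        synaptic_input = rng.uniform(0.0, 0.25)
        # Leaky integration: the potential decays toward rest and is
        # driven up by the synaptic input
        v += (dt / tau) * (v_rest - v) + synaptic_input
        if v > v_threshold:
            # Threshold exceeded: fire and reset the membrane potential
            spike_times.append(step)
            v = v_rest

    print("spikes at time steps:", spike_times)

Biologically detailed models (for example of the Hodgkin-Huxley type) refine this picture considerably, while most artificial neural networks simplify it further to a weighted sum passed through an activation function.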

Modeling of neural networks

There are many different approaches to modeling neural networks. One approach is to interconnect a number of artificial neurons to form a network. Depending on the question, these neurons can be modeled more or less closely on their biological counterparts. There are also many other types of artificial neural networks:

Networks with a teacher (supervised learning)
Networks with competition (competitive learning)
Networks with feedback (recurrent networks; a minimal example is sketched after this list)
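
As a minimal example of a network with feedback, the sketch below uses a small Hopfield network as an associative memory (compare the sub-areas section above): a corrupted pattern is fed into the recurrent network, which settles back into the stored pattern. The network size, the stored pattern and the number of update sweeps are illustrative assumptions.

    # Minimal sketch of a network with feedback: a Hopfield network used as
    # an associative memory. Pattern, size and update count are illustrative
    # assumptions.
    import numpy as np

    # One stored pattern of 8 bipolar units (+1 / -1)
    pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])

    # Hebbian learning: outer product of the pattern with itself,
    # without self-connections
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)

    # Start from a corrupted version of the pattern (two units flipped)
    state = pattern.copy()
    state[0] *= -1
    state[3] *= -1

    # Recurrent (feedback) updates: each unit repeatedly recomputes its
    # state from the weighted sum of all other units
    for _ in range(5):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("recalled pattern equals stored pattern:", np.array_equal(state, pattern))

Networks with a teacher correspond to the supervised setting sketched earlier, while in networks with competition (for example self-organizing maps) the units compete for the right to represent a given input.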

Literature

  • Russell Beale, Tom Jackson: Neural Computing. An Introduction. Adam Hilger, Bristol et al. 1990, ISBN 0-85274-262-2.
  • Simon Haykin: Neural Networks. A Comprehensive Foundation. 2nd edition. Prentice Hall, Upper Saddle River NJ 1999, ISBN 0-13-273350-1.
  • John Hertz, Anders Krogh, Richard G. Palmer: Introduction to the Theory of Neural Computation (= Santa Fe Institute Studies in the Sciences of Complexity. Lecture Notes. 1). Addison-Wesley, Redwood City CA et al. 1991, ISBN 0-201-51560-1.
  • Christof Koch: Biophysics of Computation. Information Processing in Single Neurons. Oxford University Press, New York NY 1999, ISBN 0-19-976055-1.
  • Burkhard Lenze: Introduction to the Mathematics of Neural Networks. 3rd, revised edition. Logos, Berlin 2009, ISBN 978-3-89722-021-8.
  • Raúl Rojas: Theory of Neural Networks. A Systematic Introduction. 4th, corrected reprint. Springer, Berlin et al. 1996, ISBN 3-540-56353-9 (English edition online).
  • Philip D. Wasserman: Advanced Methods in Neural Computing. Van Nostrand Reinhold, New York NY 1993, ISBN 0-442-00461-3.
  • Andreas Zell: Simulation of Neural Networks. Addison-Wesley, Bonn et al. 1994, ISBN 3-89319-554-8 (also: Stuttgart, University, habilitation thesis, 1994).
  • Peter-Michael Ziegler: IBM wants to recreate the brain. heise online, 2008. Retrieved April 2, 2014.

Web links

  • Examples: HTW Dresden - numerous student projects on various topics.