Information

The " i " is an international symbol for information in tourism and related areas

In information theory, information is the knowledge that a sender conveys to a receiver over an information channel. The information can take the form of signals or code. In many cases, the information channel is a medium. For the recipient, information leads to an increase in knowledge.

Information can be transmitted consciously, as a message from a sender to a recipient, or it can be conveyed unconsciously, attracting attention through the perceived shape and properties of an object. Information receives its value through the recipient's interpretation of the overall event on various levels. Senders and receivers need not be persons/humans; they can also be (more highly developed) animals or artificial systems (such as machines or computers/computer programs).

Definitions

Since the concept of information has been defined many times, some classic definitional approaches are presented here, which at the same time correspond to the different meanings of the term:

  • The definition “Information is the subset of knowledge that is required by a certain person or group in a specific situation and is often not explicitly available” focuses in particular on the need and the novelty value from the perspective of the recipient (user).
  • "Information is the reduction of uncertainty due to technical information processes" is primarily related to the communication process, i.e. the activity of the sender .
  • Harald H. Zimmermann advocates a user-oriented approach that focuses on the action-relevant change in knowledge: "Information is the (successful) transfer of knowledge", i.e. the (new) knowledge that leads to a change in the recipient's previous knowledge. In a narrower sense, it is the knowledge that a person (or an institution) previously lacked in order to make an appropriate decision on a current problem.
  • With "Information is knowledge in action", Rainer Kuhlen makes the action aspect of information clear.

Further definitions of information can be found in the literature in various contexts:

  • In one definition, 'information' is defined as “that part of a message that is new to the recipient”.
  • In another, the message must not be redundant (news value) and must also be relevant (pragmatics).
  • In basic library knowledge, information is referred to as the content transmitted by the various media.

'Information' is also used as a general term for data; the two expressions are often assumed to be synonymous. This has given rise to expressions such as information technology, information flow, etc., although these mostly refer to data. The term 'information processing' only makes sense if information is understood as a variant of data and message. However, information is also contrasted with data as something of a higher order, composed of data.

In addition, the term "information" (also in the plural) is a generic term for numerous documents/expressions with more specific, situation-dependent meanings; examples are notice, report, announcement, and notification.

Focus of meaning

The term "information" is used in detail with different but closely related meanings . After it is used:

  • for the activity of informing;
  • for the information channel itself;
  • in a recipient-related understanding, for the intended (knowledge) change to be achieved in the recipient;
  • finally, in relation to the actual message [arguably the most widely used meaning]. This understanding is related to that of the process, but refers not to the (physical) communication channel, but to what is sent via it.

For more detail, see the examples described below.

Properties

Energy, matter and information represent the three most important basic concepts of the natural and engineering sciences. For computer science, which sees itself as the science of the systematic processing of information, the concept of information is of central importance; nevertheless, it has hardly been made precise so far. Much can (gradually) be said about it:

  • It serves the purpose of increasing the knowledge of the potential or actual user, or of reducing their ignorance (entropy), where necessary for the realization of a specific project or action ("action-defining"), e.g. to make a decision.
  • It is "of value" to us when it expands our knowledge of the world: it conveys a difference; news is what is different.
  • If it is a prerequisite for certain actions, it is often requested or 'retrieved' by the recipient on their own initiative.
  • The recipient can reduce the amount of information according to their interest (e.g. "filter" it, use it only partially), or expand it and link it with other information.
  • Information does not need a fixed carrier. The information is not the information medium, but what the medium "transports".
  • It is "dialogical", i.e. sender- and user-related, and thus communication-dependent: without a functioning communication channel, the information sent by the sender does not reach the recipient.
  • It arises from the transmission of matter (microscopic and macroscopic), of energy or of impulses. It reaches people via the sensory organs and, in a chemical-biological sense, via receptors and nerves.
  • Information can be copied any number of times; it knows no originals.
  • Information does not age; nevertheless, it can become outdated and is then replaced by new information (e.g. the price of goods).
  • Information can be combined almost arbitrarily. One cannot tell whether its parts belong together; any manipulation is therefore possible.
  • Information can be strongly compressed, but it can also be rolled out devoid of content.

In a broader sense, the criteria that determine the quality of information are also among the properties that information can/should have. These include, for example: purpose orientation, truth/correctness, completeness, consistency (freedom from contradictions), credibility and verifiability, and topicality.

Examples

Info box on the crest of the dam wall of the Kölnbreinsperre in the Austrian Maltatal; here, information for diversion and edification is promised.

The following examples explain in detail the essentials of information:

  • Traffic sign (e.g. arrow signpost no. 418) at an intersection: The indication "A-Stadt 12 km" is conveyed to (interested) road users for their information via the visual transport ("information channel") of the perceived sign (its text, color and shape, including the directional arrow), consisting of the code (letters and so on), the syntax (words, distance information, arrow direction) and the semantics (points to ...). It expands their knowledge and reduces their ignorance (Where does this road lead? How far is it? Turn right or left?). Merely "seeing" this sign (as a medium), or failing to perceive it altogether, makes the sign and its content just as little information as if the sign were lying in a drawer.
  • Book/newspaper: The reader absorbs a great deal of information as an extension of his knowledge. He does this either after a deliberate search (non-fiction book, encyclopedia) or simply by reading (interesting news, also in a novel), in both cases only in excerpts. Information often does not appear as a singular item but reaches us in large quantities (as in news programs, etc.). It arises incidentally through perception, or deliberately on the initiative of the recipient or sender.

Further examples:

  • Information boxes in tourism: The audio box (as an information channel) emits audible signals that purposefully convey knowledge to the observer (about this building, for example).
  • Prices for a product in a shop window: price information is "data" which, when perceived by an interested passer-by, becomes information for them.
  • Time: The clock as a medium displays "data" in a certain form (code; digital, analog). For a viewer interested in the time, it serves as information; it has a meaning for him.

Structure and meaning

One perspective is based on the information carrier: it examines the question of which structure can be determined within this carrier. The other approach tries to understand the meaning of what is (somehow) extracted from this information carrier.

The first perspective has its roots in communications technology, the second in cognitive science, linguistics or, more generally, the humanities. A structure that can be recognized by communications technology (for example, light pulses that hit individual cells in the retina in a temporal sequence) must be translated into a meaning in a complex decoding process.

One of the exciting questions of information and cognitive science is where, in this decoding process, pure structural information ends and meaning information begins, i.e. where the line to consciousness is to be drawn.

These considerations result in four levels under which the concept of information is generally considered today. These are

  1. Coding
  2. Syntax
  3. Semantics
  4. Pragmatics

These levels increase with respect to the meaning content of the information. They also reflect the theoretical points of attack mentioned above: the coding level comes close to the view of communications technology, the syntax level reflects the view of linguistics or the theory of formal languages, the semantic level integrates approaches from semiotics or semantics, and the pragmatic level draws more on concepts from cognitive science.

The four levels will be explained using the string "ES IST WARM" ("IT IS WARM"), which is also used in the coded example below:

Code level

Here, the "coding" level of observation means: the form in which the (potential) information reaches its recipient(s) must be identifiable, and what is perceived must be capable of being "decoded". The information "It is warm" can be transmitted in writing (for example, as part of a newspaper article) or acoustically (via the information channel <voice, sound frequency, ear>), each consisting of characters or sounds of a specific language. The display on a thermometer (an analog column display or a numerical degree display) and even the absolute temperature itself could, in this context, be code (formats) that convey "It is warm". Other code examples would be a binary code with which such letters or a degree specification flow between two computer programs, or (optically/acoustically received) Morse code. Without knowledge of the code, what is "merely perceived" cannot be interpreted and is not 'information' in relation to the recipient.

The string "IT'S WARM" is too short for statistical analysis. In the case of longer texts, however, it becomes clear that not all elements of the character sequence (letters) occur with the same frequency. Certain letters such as e and t - but in our example s - are more common than others. This fact can be used when transmitting information in order to save transmission time. The Huffman codes are an example . They represent a process with which information can be efficiently transmitted and stored. Many other procedures exist.

Syntactic level of information

On the syntactic level, information is seen only as a structure to be conveyed. Its content is essentially of no interest here. For example, the task could be to transfer the image from a camera to a monitor. The transmission system is not interested in whether the image is worth transmitting at all (a burglar tampering with the window versus a cat walking along the window sill), or whether anything can be recognized at all (even the image from a completely out-of-focus camera is transmitted in full, although nothing recognizable can be seen in it). The information content is a measure of the maximum efficiency with which the information can be transmitted without loss.

Distinctness and information content

The basic principle of syntactic information is distinguishability: information contains what can be differentiated and what can be measured. A distinction, however, requires at least two different possibilities.

If there are exactly two possibilities, the distinction can be resolved with a single yes/no question. Example: Suppose a menu offers only two dishes, schnitzel and spaghetti. We know that the guest has ordered one of the two. To find out which one, we only have to ask one question: "Did you order schnitzel?" If the answer is "yes", he ordered schnitzel; if the answer is "no", he ordered spaghetti.

If, on the other hand, there are more than two possibilities, one can still find out which alternative applies by means of yes/no questions. A simple option would be to query all dishes in sequence. However, this is a rather inefficient method: if the guest has not yet placed an order, it takes many questions to find out. It is more efficient to first ask, for example, "Have you already ordered?", then more specifically "Was it a dish with meat?", "Was it pork?", until only a few alternatives remain ("Was it pork schnitzel?", "Roast pork?", "Pork knuckle?"). The order of the questions reflects the significance of the bits in such a coded message. The information content of a message corresponds to the number of yes/no questions needed in an ideal questioning strategy to reconstruct it.

Probabilities also play a role in an optimal questioning strategy: if one knows, for example, that half of all guests order pork schnitzel, it makes sense to ask about pork schnitzel first before going through the rest of the menu.
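
The expected number of yes/no questions under an ideal strategy is precisely the Shannon entropy of the order distribution. A minimal Python sketch, assuming a hypothetical distribution of orders (the numbers are illustrative only, not from the original):

  import math

  # Hypothetical order probabilities (illustrative):
  menu = {"pork schnitzel": 0.5, "spaghetti": 0.25,
          "roast pork": 0.125, "pork knuckle": 0.125}

  # Shannon entropy: the average number of yes/no questions an ideal
  # questioning strategy needs in order to identify the order.
  entropy = -sum(p * math.log2(p) for p in menu.values())
  print(f"{entropy:.2f} questions on average")   # 1.75

  # A frequent order costs one question (-log2(0.5) = 1),
  # a rare one costs three (-log2(0.125) = 3).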

It is interesting that, although ostensibly no semantic or pragmatic information is used here, it enters implicitly in the form of probabilities. For example, the fact that 50 percent of guests order pork schnitzel cannot be read off the menu; it is pragmatic information. And the fact that one does not normally ask whether the order was "We wish you a good appetite" follows from the semantic information that this is not a dish, and that it is therefore highly unlikely that anyone would order it.

Binarization and the probability of signs

The string "ES IST WARM" contains only capital letters. If we assume that only capital letters and the space are available (i.e. 27 characters), each of the eleven positions of the above message can hold one of these 27 characters. Each position in the message must therefore be able to represent 27 possible states.

This can be explained using a binary code: each character is represented by a sequence of bits. A bit distinguishes only between two possible states, which can be defined as one and zero. To represent 27 different states, several bits are needed; in this case five suffice, since 2^5 = 32 states can be distinguished. The specifications of such a (fictitious) code could be as follows:

  A=00001  B=00010  C=00011  D=00100  E=00101  F=00110  G=00111
  H=01000  I=01001  J=01010  K=01011  L=01100  M=01101  N=01110
  O=01111  P=10000  Q=10001  R=10010  S=10011  T=10100  U=10101
  V=10110  W=10111  X=11000  Y=11001  Z=11010  <LZ>=11100 (space)

Our message would then be

             „00101_10011_11100_01001_10011_10100_11100_10111_00001_10010_01101“  *)
corresponds to: E     S    <LZ>   I     S     T    <LZ>   W     A     R     M

*) The underscores (_) are inserted only for readability. Whether they (or other separators) are included in the message must be specified in the agreements on the format of the data transfer. If not, the message would consist of just 11 consecutive 5-bit combinations, i.e. 55 bits.
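
The fixed-width code above is easy to implement. A minimal Python sketch that builds the table from the listing (A=1 ... Z=26 in 5-bit binary, the space <LZ> assigned 11100) and encodes/decodes the message:

  # Code table as listed above: A=00001 ... Z=11010, space = 11100.
  CODES = {chr(ord("A") + i): format(i + 1, "05b") for i in range(26)}
  CODES[" "] = "11100"
  DECODE = {bits: char for char, bits in CODES.items()}

  def encode(message: str) -> str:
      return "".join(CODES[c] for c in message)

  def decode(bits: str) -> str:
      # Fixed width: simply cut the bit string into 5-bit groups.
      return "".join(DECODE[bits[i:i + 5]] for i in range(0, len(bits), 5))

  bits = encode("ES IST WARM")
  print(len(bits))       # 55 bits: 11 characters of 5 bits each
  print(decode(bits))    # ES IST WARM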

Coding each letter with 5 bits need not be the only valid coding. In classical information theory, the information sequence is considered from a statistical point of view. This makes it possible to take into account how frequently a given character of the character set is used, in other words how likely it is to occur. For example, the letter "E" is more common in the German language than the letter "Y".

If this probability of occurrence of the characters is taken into account, the number of yes/no decisions required to recognize a character can be made to differ from character to character. Such a coding is called entropy coding. Fewer bits are then needed to encode a frequently occurring character than for a rarely occurring one. A character thus has a higher information content (requires a larger number of 'atomic' decision units, bits, for its recognition) the less frequently it occurs. In addition, it must then be agreed (and reflected in the code) how the bit length of each character can be recognized.
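
Huffman coding, mentioned earlier, is the classic example of such an entropy coding: it assigns short bit sequences to frequent characters, and its prefix property is exactly the agreement that lets the decoder recognize where each character's bits end. A minimal sketch using the standard heap-based construction:

  import heapq
  from collections import Counter

  def huffman_code(text: str) -> dict[str, str]:
      """Prefix-free code: frequent characters receive short bit strings."""
      # Heap entries: (frequency, tie-breaker, {character: code so far}).
      heap = [(n, i, {c: ""}) for i, (c, n) in enumerate(Counter(text).items())]
      heapq.heapify(heap)
      tie = len(heap)
      while len(heap) > 1:
          f1, _, left = heapq.heappop(heap)
          f2, _, right = heapq.heappop(heap)
          # Merging two subtrees prepends one more bit to their codes.
          merged = {c: "0" + code for c, code in left.items()}
          merged.update({c: "1" + code for c, code in right.items()})
          heapq.heappush(heap, (f1 + f2, tie, merged))
          tie += 1
      return heap[0][2]

  code = huffman_code("ES IST WARM")
  encoded = "".join(code[c] for c in "ES IST WARM")
  print(code)          # frequent characters get codes no longer than rare ones
  print(len(encoded))  # fewer than the 55 bits of the fixed 5-bit code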

Semantic level of information

Structured, syntactic information only becomes usable when it is read and interpreted. This means that the level of meaning must be added to the structural level. For this, a certain reference system is needed in order to translate the structures into meaning. This reference system is called a code. So, in the example above, one has to know what "warm" means.

However, the transition from syntax to semantics is seldom so straightforward. As a rule, information is processed via a large number of different codes at increasingly higher semantic levels: information processing on the structural-syntactic plane takes place at each of these semantic levels. The light pulses hitting your retina are registered there by nerve cells (meaning for the nerve cell), passed on to the brain, brought into a spatial context, recognized as letters, and combined into words. Throughout all this, nerve impulses (i.e. structural information) are "fired" from one brain cell to the next, until the concepts of "warm", "now" and "here", which can only be inadequately rendered in words, begin to form in your consciousness and acquire a meaning in context: you now know that these words are about the statement that it is warm (and not cold).

Summarized:

  • Structural information is converted into semantics (meaning) in a decoding process.
  • Structural information is gradually converted into other structural information via codes, with meaning for the processing system developing at the different semantic levels.

Pragmatic level of information

This level comes closest to the everyday notion of information. The statement that it is warm (which we have now interpreted semantically correctly; we know what the message is trying to tell us) has real informational character when, half asleep at noon after a night of drinking, we are wondering what to wear, and a friend stops us from putting on a turtleneck sweater with the words "it's warm". The pragmatic information content of the semantically identical statement is zero if we are already sitting on the balcony in a T-shirt, sweating. The statement offers us nothing new and is therefore not informative.

In this context, the term granularity (communication science) describes the qualitative measure of the "accuracy of fit" of information from the perspective of the recipient.

Small talk is a type of information exchange in which the semantic information ostensibly exchanged via language is practically no pragmatic information at all; what matters here are the body signals, whose semantics (friendliness, aversion) we recognize and can use pragmatically (does he/she like me?).

In this pragmatic sense, an essential criterion of information is that it changes the subject who receives it; concretely, it changes the information that can potentially be drawn from that subject.

Summarized:

  • Information makes it possible to reduce uncertainty, but it can also increase uncertainty if its volume grows, if it is contradictory, or if it cannot be evaluated within the given time and budget.
  • Information is transferable, in the form of data or signals.
  • Information is an event that can change the state of the receiver or system. To do so, it must be "understood" by the recipient.

In this pragmatic sense, "information" is a key term in business informatics and the related business administration (information as a production factor , information as an economic good). In short: information is a reduction of uncertainty.

Relationships between the levels

When looking at the phenomenon of information, the four levels must be considered in context. For information to take place, agreements are necessary at all four levels.

Semantic processing (for example, combining letters into words) in turn produces syntactic information (namely, a sequence of word symbols). Ultimately, the pragmatic level is defined not least by the fact that it must itself create new information of a syntactic nature (otherwise the information would have no effect). Because of the close interplay between the semantic decoding process and the development of effects in pragmatics, both of which in turn generate syntactic information as end and intermediate products, these two levels are sometimes merged into semantopragmatics.

Models

The essence of information is its ability to cause changes in the receiving system. Since there is as yet no recognized unified theory of "information", only various models, a clear definition of the term is not yet available, even though a definition that has not gained general recognition could already lead to a formal description of the process of experimentation.

Explanatory approaches for the concept of information come from the humanities and social sciences (semantics, semiotics, philosophy, communication science, etc.) as well as from the natural sciences ( physics , cybernetics , communications engineering , computer science , etc.). The different approaches do not match, but they overlap.

One of the essential differences between humanities and natural-science models is that for natural science an exchange of information is already seen in the interaction of subatomic particles (see, for example, the Einstein-Podolsky-Rosen paradox, the source of Einstein's classic remark about "spooky action at a distance", since here two particles appear to exchange information instantaneously rather than at the speed of light, as Einstein predicted).

The scientific concept of "information" is closely linked to the concept of entropy (i.e. to the second law of thermodynamics). This entails numerous consequences, corresponding to the numerous consequences that follow from the second law of thermodynamics. (One possible consequence: as an object of natural science, information is understood as a potentially or actually existing usable pattern of matter or energy forms. Information here is what can be derived from the state of one system about the states of other systems.)

This scientific understanding is in contradiction to the concept of information that originates from the humanities and that dominates everyday language use.

Both the humanities and the concept of “information” in daily use tend towards an understanding in which the concept of “meaning” plays a major role. The “meaning” here is an intrinsic property of information, which also implies the existence of a (potential) recipient for whom the meaning content unfolds.

The common communication models are based on this concept. Thus, most concepts in the humanities as well as the widespread understanding in everyday language use assume that information always has a functional meaning, in contrast to the scientific understanding, in which neither function nor meaning are necessarily constitutive properties of information.

As a term in mathematical information theory, information refers to the occurrence probabilities of specific sequences of elements (for example, a sequence of letters) from a predetermined set (for example, the alphabet). This definition turns information into a calculable measure of the probability of future events in a technical system. Claude Elwood Shannon (1948) originally conceived the mathematical theory of information not for the area of human action and communication, but for the technical optimization of transmission capacities.
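
In formulas: the information content of a character x with occurrence probability p(x), and the average information content (entropy) of a source X, are given by the standard expressions

  I(x) = -\log_2 p(x), \qquad H(X) = -\sum_x p(x)\,\log_2 p(x)

A character that always occurs (p(x) = 1) carries no information; the rarer a character, the higher its information content, in line with the yes/no question picture above.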

In the area of human action, information is understood as knowledge (more precisely: the result of a process of experience) to which relevance and validity are attached in the respective current situation. In this context, talk of "information" or "informing oneself" is associated with the elimination or reduction of uncertainty, which occurs through information, clarification, communication, notification, or through knowledge of objects and phenomena. Recognizability and novelty are often part of the concept of information.

In algorithmic information theory, a measure was developed with which the complexity of structures can be determined, e.g. the complexity of strings. Under certain conditions, this can also be used as a measure of information, which has advantages over Shannon's measure in some respects.
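
Algorithmic (Kolmogorov) complexity itself is not computable, but the length of a compressed representation gives a practical upper bound on it. A sketch of this common approximation, using zlib (an illustrative choice, not a method prescribed by the theory):

  import os
  import zlib

  def complexity_proxy(data: bytes) -> int:
      """Upper bound on algorithmic complexity: size of a compressed form."""
      return len(zlib.compress(data, 9))

  regular = b"ABAB" * 250     # highly regular: a short description suffices
  random_ = os.urandom(1000)  # very likely incompressible

  print(complexity_proxy(regular))  # small: the structure can be exploited
  print(complexity_proxy(random_))  # around 1000 or more: no structure found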

Communication model of information

For a long time, the understanding of the syntactic level was characterized by the sender-receiver model: a sender wants to communicate information to a receiver. To do so, he encodes his information according to certain principles (for example, as a sequence of zeros and ones according to the principle described above) onto an information carrier; the recipient evaluates this information carrier, for he too knows the code, and thereby receives the information (see also: communication).

However, there is not always a human sender who wants to tell us something. A typical example is measurement : the physical system, figuratively speaking, doesn't care what people think of it. The aim of the measurement is to transfer information from the system being measured to the person performing the measurement (you measure in order to learn something about the system being measured).

One example is speed measurement with a radar trap: the car has no intention of revealing its speed (and usually neither does the driver). Nevertheless, the police officer gains information about the speed through the measurement. A physical law (the Doppler effect) is exploited to obtain the information, taken up by an engineer in designing the device. The police use the device and thereby cause information to be generated, while the direct generation of the information is delegated to the device. Here, too, the human being remains the author of the information: the radar device was developed by people, and the measurement results obtained are automatically displayed, recorded or transmitted in a code specified by people.

Many animals are also capable of communication - both as senders and receivers. Although this is mainly intended for communication with conspecifics (alarm calls, etc.), it can also be used by humans in some cases.

Summarized:

  • For information to be recognizable to humans, matter or energy must have a structure.
  • Syntactically, information corresponds to the occurrence probability of a particular symbol within a defined decoding scheme.
  • In the communication model, information is a spatial or temporal sequence of physical signals that occur with certain probabilities or frequencies.
  • The information content of a message results from the number of yes/no possibilities for which one of the values is specified in the message.

Information transport, creation and destruction

It is interesting that information that is bound to matter as an information carrier can be transmitted on or by electromagnetic waves. Since such information is massless, it can in principle be transported at the speed of light. Finally, the information can be bound back to a material structure. An example of such a transmission process is the fax: the information of a specific document is transported over long distances at the speed of light and transferred, at the destination, to a second document with exactly the same information content.

More generally: An information carrier is necessary to transport information.

Can information be passed on without loss? This is the case when copying software, because technical mechanisms (redundant codes/checksums) ensure it. In general, however, information cannot be passed on without becoming diminished. The extent of the loss depends on the physical boundary conditions. According to Shannon, no more information can be taken from a channel during a transmission than was put in at the sender's side. When information is passed on or copied, however, it is not actually duplicated; it is then merely available redundantly.
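
The checksum mechanism mentioned above can be sketched in a few lines: a copy counts as lossless exactly when its fingerprint matches that of the original (SHA-256 is used here as an illustrative choice):

  import hashlib

  def checksum(data: bytes) -> str:
      """Fingerprint of the data; any change to a copy changes the hash."""
      return hashlib.sha256(data).hexdigest()

  original = b"ES IST WARM"
  copy = bytes(original)  # a faithful copy

  # The copy is lossless exactly when the checksums match.
  assert checksum(copy) == checksum(original)
  print("copy verified:", checksum(copy)[:16], "...")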

In a thermodynamically closed system, information is ultimately destroyed, at the latest at the heat death of the universe. In a thermodynamically open system, information can be passed on, and information-bearing structures can even arise spontaneously. Examples are the many theoretically and experimentally investigated dissipative structures. Spin systems (spin = angular momentum of atomic and subatomic particles), especially the so-called spin glasses and Ising models, have been studied particularly often, not least because of their relevance to the theory of neural networks. Many experiments show that structures can arise spontaneously in Ising glasses which, owing to the quantized nature of spin, can even be interpreted as information already present in digitized form, e.g. containing the formation conditions of the structure in coded form.

The term in various sciences/disciplines

Information is a widely used and difficult to define concept. Various sciences (structural sciences and the humanities) treat information as their field of work, including computer science, information theory, information science, communications technology, information economics and semiotics; it may be a mathematical, philosophical or empirical (e.g. sociological) concept.

Only recently have there been efforts to combine the individual approaches and arrive at a generally applicable concept of information. Corresponding literature can currently usually be found under the keyword philosophy (e.g. in the area of epistemology). For the time being, one cannot speak of a unified, generally accepted theory of information.

In common parlance and in some sciences (semiotics, the information sciences), "information" is equated with "meaning" or "transferred knowledge". Another view, of great practical importance today in computer technology, for example, comes from communications technology. The leading theory there is Claude Shannon's; he considers the statistical aspects of the characters in a code that represents information. For Shannon, the meaning of information is only implicitly contained in the probabilities of the characters used, which can ultimately only be determined with the help of a human being, since only a human is able to consciously grasp the meaning of a code and can thereby distinguish meaningful from non-meaningful code. The immediate goal of his considerations is the optimal transmission of information in a communication channel (telephony, radio technology).

The term information, and other terms from information theory, are often used in everyday language and in the natural sciences in a metaphorical way. A direct adoption of the term information into scientific theories, as used in engineering, is rejected by some philosophers of science as inadmissible. For example, the philosopher of science Wolfgang Stegmüller warned against a resurgence of neo-vitalism through inappropriate use of information-theoretical terms in biology. However, it cannot be ruled out that the scientific concept of structure and the concept of information will someday be traced back to one another. For example, neuroinformatics and computational neuroscience investigate the relationship between the neuronal structures of the brain and its ability to process information.

To conclude, the individual disciplines and research directions are given their say, each with its own understanding of information. The respective approaches at the different levels described above, between pure syntax and pragmatics, become apparent, in some cases with particular emphasis on the transport character of information.

Semiotics

Semiotics defines data as potential information. In semiotics today, data is classified at the sigmatic level. In older literature, data is often still defined as purpose-oriented knowledge, i.e. purpose-oriented data that expand knowledge.

Information science

Information science uses the concept of information similarly to the semiotic approach. For it, the terms knowledge and information are of central importance. Information is knowledge transfer, or "knowledge in action". In this sense, it arises only at certain points: when knowledge (a certain unit of knowledge) is needed and provided for a concrete problem. This unit of knowledge is transferred as 'information' from one knowledge pool to another, for example from a database into a person's knowledge pool. Knowledge is represented internally (see also knowledge representation); information is presented, for the better understanding of the information seeker (see also information visualization).

Documentation and order theory

Wilhelm Gaus writes in his work Documentation and Ordnungslehre that information can be viewed from different aspects:

  1. Structure = structure approach
  2. Knowledge = knowledge approach
  3. Signal = signal approach
  4. Message = message approach
  5. Understood message = meaning approach
  6. Increase in knowledge = effect approach
  7. Process = process approach

Antitrust law

From an antitrust perspective, information can be defined as "any circumstance that enables the perceiver to gain knowledge". An exchange of information can be "any direct or indirect flow of information between companies about market events", whereby market events include "all activities, events, processes and interdependencies that touch, influence or can influence the nature of a market".

Information as an economic asset

Information can be viewed as an economic good, since it can be produced in the company through the use of other production factors (people, computers, software, communication, etc.) or purchased from outside. Information thus has a value that can be traded. The value results from the use of the information and from the costs of production, provision and forwarding. The problem here is that the potential buyer does not always know the value of the information in advance and can sometimes only evaluate it after acquiring it (the so-called information paradox). Even the intended trade in information is afflicted with the problem of asymmetric information.

Furthermore, information can also be understood as a production factor. Information is therefore not only used for consumption, but can also be used productively.

Information as change

According to the work of the Berlin computer scientist Peter Rüdiger: "Information is a change in concrete quantity and duration."

Defining information via change means describing information via physical effect. If a simple change is regarded as a mathematical element that brings about a change of state, then it can be shown that a set of such elements, which bring about a change of state on the same "object" and have properties such as coherence and repeatability, constitutes a mathematical group, which is declared to be information with respect to that object. This group allows a length to be defined, which can be used for optimizations, for since changes are the result of physical effects, the variational principle of least action also applies.

Another mathematical description based on the nature of change is Jan Kåhre's: The Law of Diminishing Information.

Movement is also change. A (further) definition of information via change therefore proceeds via difference in movement (information movement) and movement of difference (resting potential): "Information exists only in movement, which is always a complementary, relative movement".

Related terms

Message

Information is also used synonymously with news, report, instruction and clarification, and in some cases also for media such as newspaper articles, websites, e-mails, telephone calls, reports (quarterly, project or annual reports), prospectuses and brochures, timetables, weather reports and much more, though these are only the "carriers of information", not the information itself. These examples show the widespread use and fundamental importance of the term information in almost all areas of life.

Communication

See also: information and communication

(Human) communication is also closely related: communicability is an essential property of information, and all communication requires information.

Data

Data are only representations/items of information about facts and processes, existing in the form of certain characters/symbols on certain data carriers. For people (through cognitive activities of the recipient), they can become "information": purpose-related knowledge that is required for acting with regard to set goals. This happens by semanticizing perceived data "intra-individually" (within the respective individual) and carrying out further operations (such as inferences). Different information can be obtained from the same data. The terms information and data are therefore closely related.

Knowledge

The concept of information is closely linked to issues relating to knowledge . In particular, this includes the problem of defining complexity , which can be described using the algorithmic depth of an information-processing process. This also includes considerations about the difference between chance and order as well as the concept of distinguishability and relevance .

see also: knowledge management, intellectual property


Literature

Special topics

  • Christoph Arndt: Information Measures - Information and its Description in Science and Engineering. In: Signals and Communication Technology. Springer, Berlin 2004, ISBN 3-540-40855-X.
  • Wilhelm Gaus: Documentation and Order Theory - Theory and Practice of Information Retrieval. In: eXamen.press. 5th edition. Springer, Berlin 2005, ISBN 3-540-27518-5.
  • Andreas Holzinger: Basic Knowledge of IT/Computer Science. Volume 1: Information Technology. Vogel, Würzburg 2002, ISBN 3-8023-1897-8.
  • Martin Werner: Information and Coding. Vieweg + Teubner, Wiesbaden 2008, ISBN 978-3-8348-0232-3.

Information theory

  • Herbert Klimant, Rudi Piotraschke, Dagmar Schönfeld: Information and Coding Theory. Teubner Verlag, Wiesbaden/Stuttgart 2003, ISBN 3-519-23003-8.
  • Holger Lyre: Information Theory. Wilhelm Fink Verlag, Paderborn/Munich 2002, ISBN 3-7705-3446-8.
  • Keith Devlin: Infos and Infone. The Mathematical Structure of Information. Birkhäuser Verlag, Basel 1996, ISBN 3-7643-2703-0.
  • Jan Kåhre: The Mathematical Theory of Information. Springer, Berlin 2002, ISBN 1-4020-7064-0.
  • Peter Rechenberg: On the concept of information in information theory. In: Informatik-Spektrum (2003) 26: 317-326.

Systems theory

  • Norbert Bischof: Structure and Meaning. An Introduction to Systems Theory for Psychologists, Biologists and Social Scientists for Self-Study and Group Lessons. 2nd, corrected edition. Bern: Hans Huber, 1998, ISBN 3-456-83080-7.

Philosophy

See also the bibliography of Floridi 2005 under web links

Web links

Commons: Information - album with pictures, videos and audio files
Wiktionary: Information - explanations of meanings, word origins, synonyms, translations
Wikibooks: Information - learning and teaching materials
