# Channel capacity

The channel capacity is part of the information-theoretic description of a transmission channel. It indicates the highest bit rate at which information can be transmitted over a channel without errors.

Claude Shannon and Ralph Hartley showed, in the form of the Shannon-Hartley law, that the theoretical channel capacity can be approached arbitrarily closely by suitable coding.

## Definition

*Figure: general transmission channel*

The channel capacity C of a discrete, memoryless channel is the supremum of the amount of information that can be transmitted error-free within a time period $\tau$. A distinction is made between the information content of the transmitted symbols X and that of the received symbols Y. The difference arises:

1. due to interference that affects the symbols on the transmission channel and corrupts them,
2. due to information sent over the transmission channel being lost.

Mathematically, the mutual information (transinformation) I(X; Y) can be expressed for these two cases as:

$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$$

The channel capacity C is then given as the maximum:

$$C = \frac{1}{\tau} \max_{p(X)} I(X;Y)$$
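The definition above can be made concrete with a small sketch. The function below computes $I(X;Y) = H(Y) - H(Y \mid X)$ for a discrete memoryless channel given an input distribution and a transition matrix; the function name and the example channel (a binary symmetric channel with error probability 0.1, explained in the next section) are illustrative choices, not from the original text:

```python
import math

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) = H(Y) - H(Y|X) for a discrete memoryless channel.

    p_x          -- list of input probabilities p(x)
    p_y_given_x  -- transition matrix, p_y_given_x[x][y] = p(y|x)
    """
    n_y = len(p_y_given_x[0])
    # Output distribution: p(y) = sum_x p(x) * p(y|x)
    p_y = [sum(p_x[x] * p_y_given_x[x][y] for x in range(len(p_x)))
           for y in range(n_y)]
    h_y = -sum(q * math.log2(q) for q in p_y if q > 0)
    # Conditional entropy: H(Y|X) = sum_x p(x) * H(Y | X = x)
    h_y_given_x = -sum(p_x[x] * p_y_given_x[x][y] * math.log2(p_y_given_x[x][y])
                       for x in range(len(p_x))
                       for y in range(n_y)
                       if p_y_given_x[x][y] > 0)
    return h_y - h_y_given_x

# Binary symmetric channel with error probability 0.1; the uniform
# input distribution achieves the maximum, i.e. the capacity 1 - H(0.1):
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(mutual_information([0.5, 0.5], bsc))  # ≈ 0.531 bit
```

Finding the capacity in general means maximizing this quantity over all input distributions p(X); for symmetric channels the uniform distribution is already optimal.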

## Types of transmission channels

The channel capacity depends on the type of transmission channel. The following shows the channel capacity for important models of transmission channels.

### Binary symmetric channel

The binary transmission channel is a discrete-valued transmission channel, so its maximum channel capacity is limited to one bit. The channel accepts only two symbols (states), designated 0 and 1. With this channel model, the channel capacity therefore lies in the interval from 0 to 1.

The channel capacity of this binary symmetric channel (BSC) is:

$$C = 1 - H(p) = 1 + p \cdot \log_2 p + (1-p) \cdot \log_2 (1-p),$$

where H(p) is the binary entropy function:

$$H(p) = -(p \cdot \log_2 p + (1-p) \cdot \log_2 (1-p))$$

and p is the error probability.

For the extreme values of the error probability, p = 1 and p = 0, the capacity is one and therefore maximal. In the first case the transmission channel acts as an inverter; for p = 0 it is an error-free binary transmission channel. For p = 0.5 the channel capacity is C = 0, i.e. minimal: the BSC then behaves as an ideal noise source, and no information transfer can take place.
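The capacity formula and the three extreme cases just described can be checked with a short sketch (the function name is an illustrative choice):

```python
import math

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of the binary symmetric channel, in bit."""
    if p in (0.0, 1.0):
        return 1.0  # H(0) = H(1) = 0, so C = 1
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return 1.0 - h

print(bsc_capacity(0.0))  # 1.0  (error-free channel)
print(bsc_capacity(1.0))  # 1.0  (deterministic inverter)
print(bsc_capacity(0.5))  # 0.0  (pure noise, no information transfer)
```

Note that p = 1 still gives full capacity because the inversion is deterministic and can simply be undone at the receiver.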

### Binary erasure channel

The binary erasure channel is likewise a discrete-valued transmission channel; its channel capacity is

$$C = 1 - p$$

with the erasure probability p. The channel capacity is maximal for p = 0, where no erasure occurs. It is minimal for p = 1, where all information is erased and no information transfer is possible.
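The formula C = 1 − p can be verified from the general definition: with a uniform input, the mutual information of the erasure channel already equals its capacity. A minimal sketch of that calculation (function name is an illustrative choice):

```python
import math

def bec_mutual_information(p):
    """I(X;Y) of a binary erasure channel with erasure probability p
    and uniform input; this equals the capacity C = 1 - p."""
    # Three outputs: 0, 1, erasure. With uniform input:
    # p(y=0) = p(y=1) = (1-p)/2 and p(erasure) = p.
    p_y = [(1 - p) / 2, (1 - p) / 2, p]
    h_y = -sum(q * math.log2(q) for q in p_y if q > 0)
    # Given x, the output is x (prob. 1-p) or erased (prob. p): H(Y|X) = H(p)
    if p in (0.0, 1.0):
        h_y_given_x = 0.0
    else:
        h_y_given_x = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return h_y - h_y_given_x

print(bec_mutual_information(0.25))  # 0.75 bit, matching C = 1 - p
```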

### AWGN channel

The AWGN channel is a continuous-valued transmission channel whose disturbance is additive white Gaussian noise (AWGN). Because the channel is continuous-valued, there can in principle be an infinite number of symbols, which, however, may no longer be reliably distinguished from one another because of the noise. The resulting channel capacity C of the AWGN channel is given by the Shannon-Hartley law:

$$C = B \cdot \log_2 \left(1 + \frac{S}{N}\right)$$

Here B is the bandwidth in Hz and S/N is the signal-to-noise ratio (SNR) between the signal to be transmitted and the additive white Gaussian noise on the channel. The channel capacity is maximal when the SNR is maximal. In the limiting case S/N → ∞, an infinitely high channel capacity results, because infinitely many symbols can then be distinguished.
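The Shannon-Hartley law is easy to evaluate numerically. The following sketch uses a hypothetical telephone channel (B = 3100 Hz, SNR = 30 dB) as its example; those numbers and the function name are illustrative assumptions, not values from the text. Note that the formula takes the SNR as a linear power ratio, so a dB value must be converted first:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley law: C = B * log2(1 + S/N), in bit/s.

    snr_linear is the signal-to-noise ratio as a linear power ratio,
    not in dB.
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical telephone channel: B = 3100 Hz, SNR = 30 dB
snr = 10 ** (30 / 10)                    # 30 dB -> linear factor 1000
print(round(awgn_capacity(3100, snr)))   # ≈ 30898 bit/s
```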

## Distinction from the transmission rate

To achieve the transmission rate, different symbols are transmitted over the channel in temporal sequence. This sequence of symbols results in a symbol rate that occupies a certain spectral bandwidth. The transmission rate R is then the product of the symbol rate and the number of bits of information carried per symbol. How many bits can be transmitted per symbol depends directly on the channel capacity and thus on the particular channel.
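The relation between symbol rate and transmission rate is a simple product; a minimal sketch, with a hypothetical 16-QAM example (2400 baud at 4 bit per symbol) chosen purely for illustration:

```python
def transmission_rate(symbol_rate_baud, bits_per_symbol):
    """R = symbol rate * bits per symbol, in bit/s."""
    return symbol_rate_baud * bits_per_symbol

# Hypothetical example: 2400 baud with 4 bit/symbol (e.g. 16-QAM)
print(transmission_rate(2400, 4))  # 9600 bit/s
```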
