Bit time

From Wikipedia, the free encyclopedia

In computer networks, the bit time is the time required to send exactly one bit from the transmitter to the receiver. It corresponds to the bit interval, the time between two consecutive bits, and is the reciprocal of the data rate.
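As an illustration (not part of the original article), the relationship between bit time and data rate can be sketched in a few lines of Python:

```python
def bit_time_ns(bit_rate_bps: float) -> float:
    """Bit time is the reciprocal of the data rate:
    at R bits per second, one bit occupies 1/R seconds on the line."""
    return 1e9 / bit_rate_bps  # converted to nanoseconds

print(bit_time_ns(10e6))   # 10 Mbit/s Ethernet  -> 100.0 ns
print(bit_time_ns(100e6))  # 100 Mbit/s Ethernet -> 10.0 ns
print(bit_time_ns(1e9))    # 1000 Mbit/s Ethernet -> 1.0 ns
```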

For the receiver to read a binary 1 or 0, there must be a fixed interval within whose segments the logical state of a bit can be determined. In the simplest case, the logical state of a bit corresponds directly to the level on the physical line, but there are also more elaborate line codes.

The transmission time is the product of the number of bits sent and the bit interval of the technology in question. In 10 Mbit/s Ethernet, the bit interval is 100 ns. One byte is 8 bits, so transmitting 1 byte takes 800 ns. The minimum 10BASE-T frame size required for CSMA/CD to work is 64 bytes; transmitting such a frame takes 51,200 ns, i.e. 51.2 microseconds (64 octets × 800 ns).
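The arithmetic above can be checked with a short sketch (the function name is illustrative, not from the article):

```python
def transmission_time_ns(num_bytes: int, bit_interval_ns: int) -> int:
    """Transmission time = number of bits sent x bit interval."""
    return num_bytes * 8 * bit_interval_ns

# 10 Mbit/s Ethernet: bit interval = 100 ns
print(transmission_time_ns(1, 100))   # 1 byte  -> 800 ns
print(transmission_time_ns(64, 100))  # 64-byte minimum frame -> 51200 ns (51.2 us)
```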

At all Ethernet data rates of 1000 Mbit/s or less, no frame transmission may take less time than the slot time. The slot time for 10 Mbit/s and 100 Mbit/s Ethernet is 512 bit intervals, i.e. 51.2 microseconds and 5.12 microseconds respectively. For 1000 Mbit/s Ethernet it is 4096 bit intervals (512 octets), i.e. 4.096 microseconds.
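The slot-time values can likewise be derived from the bit intervals (a sketch; the helper name is an assumption):

```python
def slot_time_us(bit_intervals: int, bit_interval_ns: int) -> float:
    """Slot time = number of bit intervals x bit interval, in microseconds."""
    return bit_intervals * bit_interval_ns / 1000.0

print(slot_time_us(512, 100))  # 10 Mbit/s:   512 x 100 ns -> 51.2 us
print(slot_time_us(512, 10))   # 100 Mbit/s:  512 x 10 ns  -> 5.12 us
print(slot_time_us(4096, 1))   # 1000 Mbit/s: 4096 x 1 ns  -> 4.096 us
```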