Bit time is a concept in computer networking. It is defined as the time it takes for one bit to be ejected from a network interface controller (NIC) operating at some predefined standard speed, such as 10 Mbit/s. It is measured from the moment the logical link control sublayer receives the instruction from the operating system until the bit actually leaves the NIC. Bit time is unrelated to the time it takes a bit to travel across the network medium; it concerns only the internals of the NIC.
To calculate the bit time of a NIC, use the following formula:
bit time = 1 / NIC speed
For a 10 Mbit/s NIC, the calculation is:
bit time = 1 / (10 * 10^6) = 10^-7 = 100 * 10^-9 = 100 nanoseconds
The bit time for a 10 Mbit/s NIC is 100 nanoseconds. That is, a 10 Mbit/s NIC can eject 1 bit every 0.1 microsecond (100 nanoseconds = 0.1 microseconds).
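The formula above can be sketched as a small helper function (the function name `bit_time_ns` is illustrative, not a standard API):

```python
def bit_time_ns(nic_speed_bps: float) -> float:
    """Return the bit time in nanoseconds for a NIC operating
    at nic_speed_bps bits per second (bit time = 1 / speed)."""
    return (1 / nic_speed_bps) * 1e9  # seconds -> nanoseconds

print(bit_time_ns(10e6))   # 10 Mbit/s  -> 100.0 ns
print(bit_time_ns(100e6))  # 100 Mbit/s -> 10.0 ns
print(bit_time_ns(1e9))    # 1 Gbit/s   -> 1.0 ns
```

As the speed increases tenfold, the bit time shrinks by the same factor, which is why faster NICs can eject proportionally more bits in the same interval.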
Bit time is distinct from slot time, which is the time taken for a pulse to travel through the longest permitted length of network medium.