Latency

Meaning – Latency refers to the time interval between the instant at which an instruction control unit initiates a call for data and the instant at which the actual transfer of the data starts.

Latency is generally measured in milliseconds (ms) and is unavoidable given the way networks communicate with one another. It depends on several aspects of a network and can vary when any of them change.
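
For concreteness, the short Python sketch below times the establishment of a TCP connection and reports the result in milliseconds. It is only an illustration: the host and port are placeholders, and TCP connect time is one practical proxy for network latency, not a definitive measurement.

```python
import socket
import time

def measure_latency_ms(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Approximate network latency as the time (in ms) to open a TCP connection."""
    start = time.perf_counter()
    # The interval runs from initiating the request to the handshake completing.
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000  # seconds -> milliseconds

if __name__ == "__main__":
    # "example.com" is a placeholder; substitute any reachable host.
    print(f"Latency: {measure_latency_ms('example.com'):.1f} ms")
```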

Network latency can be affected by any of the following factors –

  1. Transmission medium: The physical path between the start point and the endpoint.
  2. Propagation: The farther apart two communicating nodes are, the greater the latency, since latency depends directly on the distance between them (see the propagation sketch after this list).
  3. Routers: The efficiency with which routers process incoming data has a direct impact on latency.
  4. Storage delays: Accessing stored data can increase latency, as the storage network may take time to process and return the information.
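
To see why distance matters (point 2 above), the following sketch estimates one-way propagation delay, assuming a signal speed in optical fibre of roughly two-thirds the speed of light; the 5,600 km figure is an approximate New York–London distance used purely for illustration.

```python
SPEED_OF_LIGHT_KM_S = 300_000                   # approx. speed of light in vacuum
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~200,000 km/s in optical fibre

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay (ms) over the given fibre distance."""
    return distance_km / FIBRE_SPEED_KM_S * 1000

# Roughly 5,600 km of fibre (about New York to London) gives ~28 ms one way.
print(f"{propagation_delay_ms(5600):.0f} ms")
```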

Example of usage – “Latency can be reduced by addressing the aforementioned components and ensuring that they are working correctly.”