Network latency refers to the time it takes for data to travel from the source to its destination across a network. This time delay can be influenced by several factors, including the physical distance the data must travel, the quality of the network connection, and any processing delays along the way. A lower latency means a quicker response time, which is crucial for applications requiring real-time data transmission, such as video conferencing or online gaming.
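One common way to observe latency in practice is to time how long a TCP connection takes to be established, since that handshake requires a round trip to the remote host. The sketch below is a minimal illustration of that idea; the host and port passed to it are whatever endpoint you choose to probe.

```python
import socket
import time

def measure_latency_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Time a TCP connection setup as a rough proxy for network latency.

    The three-way handshake needs one round trip, so the elapsed time
    approximates the round-trip delay to the host (plus some local overhead).
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only care about the elapsed time
    return (time.perf_counter() - start) * 1000.0

# Example (hypothetical endpoint): measure_latency_ms("example.com", 443)
```

Tools like `ping` use ICMP echo requests instead, but the principle is the same: latency is measured as elapsed round-trip time, not as how much data the link can carry.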
The other choices describe related networking concepts: bandwidth (data capacity), physical distance, and throughput (transfer speed). Each of these influences or accompanies data transmission, but none of them defines latency, which is specifically the time delay data experiences in transit.